Fix Ollama provider crash: _convert_message assumes objects, gets dicts after OpenAI migration #3324

@aakankshaduggal

Description

🚀 Describe the new functionality needed

After the OpenAI chat completions migration PR, CI is failing with:

AttributeError: 'dict' object has no attribute 'content'

This happens in providers/remote/inference/ollama/ollama.py::_convert_message, which expects Message objects but now sometimes receives OpenAI-style dicts.
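For context, a minimal repro of the failure mode. The Message class and converter below are simplified stand-ins; the real _convert_message in ollama.py handles more than role and content:

```python
from dataclasses import dataclass


@dataclass
class Message:
    """Simplified stand-in for the llama-stack Message type."""
    role: str
    content: str


def _convert_message(message):
    """Pre-migration assumption: message is an object with attributes."""
    content = message.content  # raises AttributeError for a plain dict
    return {"role": message.role, "content": content}


# A Message object converts fine.
print(_convert_message(Message(role="user", content="hi")))

# An OpenAI-style dict does not, which is the CI failure above.
try:
    _convert_message({"role": "user", "content": "hi"})
except AttributeError as err:
    print(err)  # 'dict' object has no attribute 'content'
```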

Blocks #3067

💡 Why is this needed? What if we don't build it?

  1. Dict conversion happens in the new OpenAI call path.
  2. These dicts leak into Ollama, which still expects Message objects.
  3. Result: Ollama crashes, blocking CI and the OpenAI chat completions migration PR.

Other thoughts

Two options here:

  1. Make Ollama tolerant: update _convert_message to handle both dicts and objects.
  2. Contain dicts: ensure dict conversion only happens in OpenAI call paths and revert the global conversion.

Either path unblocks CI: Option 1 is the quickest fix, Option 2 is stricter. A rough sketch of Option 1 is below.
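This is only the shape of the fix, not a drop-in patch, since it assumes the converter only needs role and content; the real _convert_message in providers/remote/inference/ollama/ollama.py also deals with things like images and tool calls:

```python
from types import SimpleNamespace
from typing import Any


def _convert_message(message: Any) -> dict:
    """Tolerate both OpenAI-style dicts and Message-like objects."""
    if isinstance(message, dict):
        # New OpenAI call path: plain dicts, key access.
        role = message.get("role")
        content = message.get("content")
    else:
        # Existing path: Message objects, attribute access.
        role = message.role
        content = message.content
    return {"role": role, "content": content}


# Both input shapes now convert without raising.
print(_convert_message({"role": "user", "content": "hi"}))
print(_convert_message(SimpleNamespace(role="user", content="hi")))
```

The bare isinstance(message, dict) check keeps the change local to Ollama; Option 2 would instead keep dicts from reaching this function at all.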


Labels: enhancement (New feature or request)
