
Conversation

@mznmel commented Jun 23, 2025

The original tutorial was inconsistent about the autolog call, mentioning both mlflow.dspy.autolog() and mlflow.autolog(), while the code example used neither. I updated the documentation and code example to use mlflow.dspy.autolog().
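For reference, this is the shape the updated snippet follows; a minimal sketch where the model name and the "question -> answer" signature are placeholders for illustration, not necessarily what the tutorial uses:

```python
import dspy
import mlflow

# DSPy-flavor autologging: records traces of DSPy module and LM calls in MLflow.
mlflow.dspy.autolog()

# Placeholder LM and program, just to show where the autolog call sits.
dspy.configure(lm=dspy.LM("openai/gpt-4o-mini"))
predict = dspy.Predict("question -> answer")

result = predict(question="What does mlflow.dspy.autolog() capture?")
print(result.answer)
```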

@okhat (Collaborator) commented Jun 23, 2025

Thanks so much @mznmel!

@chenmoneygithub (Collaborator) commented
@TomeHirata Is the issue where LiteLLM generates multiple trace entities fixed on the MLflow side?

Meanwhile, there is a special case: when streaming is enabled, dspy.LM combines the streamed tokens into one response. In that setup, tracing LM.forward() makes sense to me, wdyt?
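For context, a rough sketch of the streaming setup being discussed, assuming dspy.streamify is the entry point; the model name and signature are illustrative, not taken from this PR:

```python
import asyncio

import dspy
import mlflow

mlflow.dspy.autolog()
dspy.configure(lm=dspy.LM("openai/gpt-4o-mini"))

# Wrap the program so calls stream chunks back to the caller; internally
# dspy.LM still assembles the streamed tokens into one final response,
# which is the case where tracing LM.forward() would be useful.
stream_predict = dspy.streamify(dspy.Predict("question -> answer"))

async def main():
    async for chunk in stream_predict(question="What gets traced here?"):
        # Intermediate items are stream chunks; the last item is the
        # final dspy.Prediction with the combined response.
        print(chunk)

asyncio.run(main())
```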

@TomeHirata (Collaborator) commented Jun 25, 2025

@chenmoneygithub The fix for the issue was filed against the LiteLLM repo a while ago (BerriAI/litellm#8202), but we haven't heard back from them. Can we ping them directly?

@TomeHirata (Collaborator) commented
fyi: We're talking to the LiteLLM team. Please keep this PR on hold for a while.
