
Clarify what works OOTB for Python AI Integrations #13167

Open
@smeubank

SDK

Python SDK

Description

Users have reported that our LLM Monitoring documentation is unclear about what works out of the box versus what requires manual setup or decorators.

  • For the OpenAI Python library, it is unclear whether monitoring is enabled automatically once the package is installed, or whether manual instrumentation with @ai_track is required.

  • For LangChain, users expected automatic tracking of LLM calls without explicitly initializing Sentry with the LangChain integration or using decorators. In practice, LLM monitoring only activates once the integration is explicitly initialized, and traces only appear when decorators are used (see the sketch after this list).
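
As a rough illustration of the LangChain scenario above, the sketch below shows what the explicit setup currently looks like. It assumes the `LangchainIntegration` class in `sentry_sdk.integrations.langchain` and the `ai_track` decorator in `sentry_sdk.ai.monitoring`; the DSN is a placeholder and names/signatures should be verified against the current SDK.

```python
import sentry_sdk
from sentry_sdk.integrations.langchain import LangchainIntegration
from sentry_sdk.ai.monitoring import ai_track

# LLM monitoring only activates once the integration is passed explicitly
# and tracing is enabled.
sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    traces_sample_rate=1.0,
    integrations=[LangchainIntegration()],
)

# Traces for the pipeline only appear when it is wrapped with the decorator.
@ai_track("My AI pipeline")
def my_pipeline(question: str) -> None:
    # LangChain chain/agent calls go here; the integration records
    # LLM spans inside the surrounding transaction.
    ...

with sentry_sdk.start_transaction(op="ai-inference", name="AI pipeline"):
    my_pipeline("What is Sentry?")
```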

Suggested Solution

  • Explicitly state which AI integrations (e.g., OpenAI, LangChain) support automatic instrumentation, and what automatic instrumentation means in that context. Just errors? Tracing? LLM-specific insights?

  • Provide clear examples for each integration scenario highlighting the necessary setup steps (a minimal sketch follows this list).
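
For the OpenAI case, a docs example might look roughly like the sketch below. It assumes the OpenAI integration is auto-enabled when the openai package is installed and tracing is turned on, which is exactly the behavior the documentation should confirm or correct; the DSN and model name are placeholders.

```python
import sentry_sdk
from openai import OpenAI

# If the OpenAI integration is auto-enabled (the behavior this issue asks
# the docs to confirm), no decorator should be needed: enabling tracing is
# enough for LLM spans to be captured on client calls.
sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    traces_sample_rate=1.0,
)

client = OpenAI()
with sentry_sdk.start_transaction(op="ai-inference", name="Chat completion"):
    client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model
        messages=[{"role": "user", "content": "Hello!"}],
    )
```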

Background

getsentry/sentry-python#3007 (comment)
