OpenAI | Sentry for Python

Learn about using Sentry for OpenAI.

This integration connects Sentry with the OpenAI Python SDK.

Once you've installed this SDK, you can use Sentry AI Agents Monitoring, a Sentry dashboard that helps you understand what's going on with your AI requests.

Sentry AI Monitoring will automatically collect information about prompts, tools, tokens, and models. Learn more about the AI Agents Dashboard.

Install

Install sentry-sdk from PyPI with the openai extra:


pip install "sentry-sdk[openai]"
Configure

If you have the openai package in your dependencies, the OpenAI integration will be enabled automatically when you initialize the Sentry SDK.

An additional dependency, tiktoken, is required if you want to calculate token usage for streaming chat responses.
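For example, after installing tiktoken (pip install tiktoken), the integration can record token usage for streamed chat responses. The sketch below is illustrative only; the model name and prompt are placeholder assumptions, not values from this guide:

import sentry_sdk
from openai import OpenAI

sentry_sdk.init(...)  # configured as shown below

client = OpenAI(api_key="(your OpenAI key)")

# With tiktoken installed, token usage can be calculated
# for this streamed response.
stream = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "say hello"}],
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")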

The configuration below enables error monitoring, logs, tracing, and profiling:

import sentry_sdk

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    # Add data like LLM inputs and responses to captured events.
    send_default_pii=True,
    # Set traces_sample_rate to 1.0 to capture 100%
    # of transactions for tracing.
    traces_sample_rate=1.0,
    # Set profile_session_sample_rate to 1.0 to profile 100%
    # of profile sessions.
    profile_session_sample_rate=1.0,
    # Set profile_lifecycle to "trace" to automatically run
    # the profiler whenever there is an active transaction.
    profile_lifecycle="trace",
    # Enable sending logs to Sentry.
    _experiments={
        "enable_logs": True,
    },
)
Verify

Verify that the integration works by making a chat request to OpenAI.


import sentry_sdk
from openai import OpenAI

sentry_sdk.init(...)  # use the same configuration as above

client = OpenAI(api_key="(your OpenAI key)")

def my_llm_stuff():
    with sentry_sdk.start_transaction(
        name="The result of the AI inference",
        op="ai-inference",
    ):
        print(
            client.chat.completions.create(
                model="gpt-3.5-turbo",
                messages=[{"role": "system", "content": "say hello"}],
            )
            .choices[0]
            .message.content
        )

my_llm_stuff()

After running this script, the resulting data should show up in the "AI Spans" tab on the "Explore" > "Traces" page on Sentry.io.

If you manually created an Invoke Agent Span (not done in the example above), the data will also show up in the AI Agents Dashboard.
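As a rough sketch of what manual instrumentation could look like: the op and attribute names below (gen_ai.invoke_agent, gen_ai.operation.name, gen_ai.agent.name) are assumptions based on Sentry's AI agents span conventions, not something shown in this guide, so check them against the manual instrumentation docs before relying on them:

import sentry_sdk

with sentry_sdk.start_transaction(name="My agent run", op="ai-inference"):
    # Assumed op and attribute names; see the note above.
    with sentry_sdk.start_span(
        op="gen_ai.invoke_agent",
        name="invoke_agent My Agent",
    ) as span:
        span.set_data("gen_ai.operation.name", "invoke_agent")
        span.set_data("gen_ai.agent.name", "My Agent")
        # Make your OpenAI calls here so they nest under the agent span.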

It may take a couple of moments for the data to appear in sentry.io.

Behavior Options

By adding OpenAIIntegration explicitly to your sentry_sdk.init() call, you can pass options that change its behavior:


import sentry_sdk
from sentry_sdk.integrations.openai import OpenAIIntegration

sentry_sdk.init(
    # ... your DSN and other options from above
    send_default_pii=True,
    integrations=[
        OpenAIIntegration(
            include_prompts=False,  # LLM inputs/outputs will not be sent to Sentry, despite send_default_pii=True
            tiktoken_encoding_name="cl100k_base",
        ),
    ],
)

You can pass the following keyword arguments to OpenAIIntegration():

include_prompts: Whether LLM prompts and responses are recorded. Defaults to True; prompt data is only sent when send_default_pii=True, and setting include_prompts=False keeps it out even then.

tiktoken_encoding_name: The name of the tiktoken encoding (for example, "cl100k_base") used to count tokens for streaming chat responses; this requires the tiktoken package.

