Attention
The mlflow.tracing namespace only contains a few utility functions for managing traces. The main entry points for MLflow Tracing are the Tracing Fluent APIs defined directly under the mlflow namespace and the low-level Tracing Client APIs.
Disable tracing.
Note
This function sets up OpenTelemetry to use NoOpTracerProvider and effectively disables all tracing operations.
Example:
    import mlflow


    @mlflow.trace
    def f():
        return 0


    # Tracing is enabled by default
    f()
    assert len(mlflow.search_traces()) == 1

    # Disable tracing
    mlflow.tracing.disable()
    f()
    assert len(mlflow.search_traces()) == 1
Disables displaying the MLflow Trace UI in notebook output cells. Call mlflow.tracing.enable_notebook_display() to re-enable display.
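For instance, a minimal sketch of toggling the display off and back on inside a notebook, using only the two functions documented in this section:

    import mlflow

    # Turn off inline trace rendering in notebook output cells
    mlflow.tracing.disable_notebook_display()

    # ... traced code runs here without the Trace UI appearing ...

    # Turn inline rendering back on
    mlflow.tracing.enable_notebook_display()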
Enable tracing.
Example:
    import mlflow


    @mlflow.trace
    def f():
        return 0


    # Tracing is enabled by default
    f()
    assert len(mlflow.search_traces()) == 1

    # Disable tracing
    mlflow.tracing.disable()
    f()
    assert len(mlflow.search_traces()) == 1

    # Re-enable tracing
    mlflow.tracing.enable()
    f()
    assert len(mlflow.search_traces()) == 2
Enables the MLflow Trace UI in notebook output cells. The display is on by default, and the Trace UI will show up when any of the following operations are executed:
- On trace completion (i.e. whenever a trace is exported)
- When calling the mlflow.search_traces() fluent API
- When calling the mlflow.client.MlflowClient.get_trace() or mlflow.client.MlflowClient.search_traces() client APIs

To disable, please call mlflow.tracing.disable_notebook_display().
Reset the flag that indicates whether the MLflow tracer provider has been initialized. This ensures that the tracer provider is re-initialized when the next tracing operation is performed.
Note
Experimental: This function may change or be removed in a future release without warning.
Set a custom span destination to which MLflow will export the traces.
A destination specified by this function will take precedence over other configurations, such as the tracking URI or OTLP environment variables.
To reset the destination, call the mlflow.tracing.reset() function.
destination – A TraceDestination object that specifies the destination of the trace data.
Example
    import mlflow
    from mlflow.tracing.destination import MlflowExperiment

    # Setting the destination to an MLflow experiment with ID "123"
    mlflow.tracing.set_destination(MlflowExperiment(experiment_id="123"))

    # Reset the destination (to an active experiment as default)
    mlflow.tracing.reset()
Set the mlflow.chat.messages attribute on the specified span. This attribute is used in the UI, and also by downstream applications that consume trace data, such as MLflow evaluate.
span – The LiveSpan to add the attribute to
messages – A list of standardized chat messages (refer to the spec for details)
append – If True, the messages will be appended to the existing messages. Otherwise, the attribute will be overwritten entirely. Default is False. This is useful when you want to record messages incrementally, e.g., log input messages first, and then log output messages later (a sketch of this pattern follows the example below).
Example:
    import mlflow
    from mlflow.tracing import set_span_chat_messages


    @mlflow.trace
    def f():
        messages = [{"role": "user", "content": "hello"}]
        span = mlflow.get_current_active_span()
        set_span_chat_messages(span, messages)
        return 0


    f()
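The append flag described above allows the attribute to be built up incrementally. Below is a minimal sketch, assuming the set_span_chat_messages(span, messages, append=...) signature documented above; the answer string is a placeholder standing in for a real model call:

    import mlflow
    from mlflow.tracing import set_span_chat_messages


    @mlflow.trace
    def chat(question: str):
        span = mlflow.get_current_active_span()

        # Log the input messages first
        set_span_chat_messages(span, [{"role": "user", "content": question}])

        answer = "hi there"  # placeholder for a real model call

        # Append the output message rather than overwriting the attribute
        set_span_chat_messages(
            span, [{"role": "assistant", "content": answer}], append=True
        )
        return answer


    chat("hello")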
Set the mlflow.chat.tools attribute on the specified span. This attribute is used in the UI, and also by downstream applications that consume trace data, such as MLflow evaluate.
span – The LiveSpan to add the attribute to
tools – A list of standardized chat tool definitions (refer to the spec for details)
Example:
    import mlflow
    from mlflow.tracing import set_span_chat_tools

    tools = [
        {
            "type": "function",
            "function": {
                "name": "add",
                "description": "Add two numbers",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "a": {"type": "number"},
                        "b": {"type": "number"},
                    },
                    "required": ["a", "b"],
                },
            },
        }
    ]


    @mlflow.trace
    def f():
        span = mlflow.get_current_active_span()
        set_span_chat_tools(span, tools)
        return 0


    f()
Note
Experimental: This class may change or be removed in a future release without warning.
A destination representing a Databricks tracing server.
When this destination is set via the mlflow.tracing.set_destination() function, MLflow will log traces to the specified experiment.
If neither experiment_id nor experiment_name is specified, an active experiment when traces are created will be used as the destination. If both are specified, they must refer to the same experiment.
experiment_id (Optional[str]) – The ID of the experiment to log traces to.
experiment_name (Optional[str]) – The name of the experiment to log traces to.
type – Type of the destination.
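A minimal usage sketch, assuming this class is exposed as Databricks under mlflow.tracing.destination:

    import mlflow
    from mlflow.tracing.destination import Databricks

    # Log traces to the Databricks experiment with ID "123"
    mlflow.tracing.set_destination(Databricks(experiment_id="123"))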
Note
Experimental: This class may change or be removed in a future release without warning.
A destination representing an MLflow experiment.
When this destination is set via the mlflow.tracing.set_destination() function, MLflow will log traces to the specified experiment.
experiment_id (Optional[str]) – The ID of the experiment to log traces to. If not specified, the current active experiment will be used.
tracking_uri (Optional[str]) – The tracking URI of the MLflow server to log traces to. If not specified, the current tracking URI will be used.
type – Type of the destination.
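A short sketch combining both fields, assuming a tracking_uri keyword as described above (the server URL is a placeholder):

    import mlflow
    from mlflow.tracing.destination import MlflowExperiment

    # Log traces to experiment "123" on a specific tracking server
    mlflow.tracing.set_destination(
        MlflowExperiment(experiment_id="123", tracking_uri="http://localhost:5000")
    )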
Note
Experimental: This class may change or be removed in a future release without warning.
A configuration object for specifying the destination of trace data.
type – Type of the destination.