Collect OpenTelemetry Traces into MLflow
OpenTelemetry trace ingestion is supported in MLflow 3.6.0 and above.
OpenTelemetry endpoint (OTLP)
MLflow Server exposes an OTLP/HTTP endpoint at /v1/traces. This endpoint accepts traces from any native OpenTelemetry instrumentation, allowing you to trace applications written in other languages such as Java, Go, and Rust.
To use this endpoint, start MLflow Server with a SQL-based backend store. The following command starts MLflow Server with a SQLite backend store:
mlflow server --backend-store-uri sqlite:///mlflow.db
To use other types of SQL databases such as PostgreSQL, MySQL, and MSSQL, change the store URI as described in the backend store documentation.
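For example, a PostgreSQL-backed server might be started as follows; the database name, user, password, and host below are placeholders to substitute with your own:

```shell
# Hypothetical connection details -- replace user, password, host, and
# database name with your own before running.
mlflow server \
  --backend-store-uri postgresql://mlflow_user:mlflow_pass@localhost:5432/mlflow \
  --host 0.0.0.0 --port 5000
```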
In your application, configure the server endpoint and set the MLflow experiment ID in the OTLP header x-mlflow-experiment-id.
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=http://localhost:5000/v1/traces
export OTEL_EXPORTER_OTLP_TRACES_HEADERS=x-mlflow-experiment-id=123
As of MLflow 3.6.0, MLflow Server supports only the OTLP/HTTP endpoint. The OTLP/gRPC endpoint is not yet supported.
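To make the wire format concrete, the sketch below builds a minimal OTLP/JSON span payload of the kind the /v1/traces endpoint accepts and prepares an HTTP request with the standard library. The shape follows the generic OTLP/HTTP JSON encoding rather than anything MLflow-specific, and the trace ID, span ID, and service name are made-up illustrative values; real instrumentation generates these for you.

```python
import json
import time
import urllib.request

now_ns = time.time_ns()

# Minimal OTLP/JSON trace payload. IDs are hex strings in the JSON encoding,
# and the values below are hypothetical placeholders for illustration.
payload = {
    "resourceSpans": [
        {
            "resource": {
                "attributes": [
                    {"key": "service.name", "value": {"stringValue": "demo-app"}}
                ]
            },
            "scopeSpans": [
                {
                    "scope": {"name": "manual-example"},
                    "spans": [
                        {
                            "traceId": "5b8efff798038103d269b633813fc60c",
                            "spanId": "eee19b7ec3c1b174",
                            "name": "demo-span",
                            "kind": 1,  # SPAN_KIND_INTERNAL
                            "startTimeUnixNano": str(now_ns - 1_000_000),
                            "endTimeUnixNano": str(now_ns),
                        }
                    ],
                }
            ],
        }
    ]
}

# The experiment ID travels in the x-mlflow-experiment-id header.
request = urllib.request.Request(
    "http://localhost:5000/v1/traces",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "x-mlflow-experiment-id": "123",
    },
)
# urllib.request.urlopen(request)  # uncomment with a running MLflow Server
```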
Basic Example
The following example shows how to collect traces from a FastAPI application using OpenTelemetry FastAPI instrumentation.
import os

import uvicorn
from fastapi import FastAPI
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.instrumentation.fastapi import FastAPIInstrumentor
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Point the OTLP exporter at MLflow Server and set the target experiment
MLFLOW_TRACKING_URI = "http://localhost:5000"
MLFLOW_EXPERIMENT_ID = "123"
os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = f"{MLFLOW_TRACKING_URI}/v1/traces"
os.environ["OTEL_EXPORTER_OTLP_TRACES_HEADERS"] = (
    f"x-mlflow-experiment-id={MLFLOW_EXPERIMENT_ID}"
)

# Configure the OpenTelemetry SDK to export spans over OTLP/HTTP.
# OTLPSpanExporter reads the endpoint and headers from the environment
# variables set above.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
trace.set_tracer_provider(provider)

app = FastAPI()
FastAPIInstrumentor.instrument_app(app)


@app.get("/")
async def root():
    return {"message": "Hello, World!"}


if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)
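To try the example end to end, the steps below install the required packages and exercise the instrumented endpoint; the package names are the standard OpenTelemetry distributions on PyPI, and app.py is an assumed filename for the code above. Each request to the app should then appear as a trace in the configured experiment.

```shell
# Install FastAPI, the FastAPI instrumentation, and the OTLP/HTTP exporter
pip install fastapi uvicorn \
    opentelemetry-instrumentation-fastapi \
    opentelemetry-exporter-otlp-proto-http

# Start the app (assumes the example is saved as app.py), then send a request
python app.py &
curl http://localhost:8000/
```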
Using OpenTelemetry Collector
OpenTelemetry Collector is a vendor-agnostic agent that collects, processes, and exports traces to various observability platforms. To configure OpenTelemetry Collector to ingest traces into MLflow, export the MLflow connection settings and save the following configuration as opentelemetry-collector.yaml:
export MLFLOW_TRACKING_URI=http://localhost:5000
export MLFLOW_EXPERIMENT_ID=123
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318

processors:
  batch:

exporters:
  # MLflow Server supports only OTLP/HTTP, so use the otlphttp exporter.
  # The exporter appends /v1/traces to the endpoint automatically.
  otlphttp:
    endpoint: ${MLFLOW_TRACKING_URI}
    headers:
      x-mlflow-experiment-id: ${MLFLOW_EXPERIMENT_ID}

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlphttp]
Run the Collector with Docker, passing the environment variables through to the container and mounting the configuration at the image's default config path:
docker run -d --name opentelemetry-collector \
  -p 4317:4317 \
  -p 4318:4318 \
  -e MLFLOW_TRACKING_URI \
  -e MLFLOW_EXPERIMENT_ID \
  -v $(pwd)/opentelemetry-collector.yaml:/etc/otelcol/config.yaml \
  otel/opentelemetry-collector
Note that inside the container, localhost refers to the container itself. If MLflow Server runs on the Docker host, set MLFLOW_TRACKING_URI to http://host.docker.internal:5000 (or run the container with --network host on Linux).