One post tagged with "ollama"
View All Tags
Featured
Beyond Autolog: Add MLflow Tracing to a New LLM Provider
In this post, we show how to add MLflow Tracing to a new LLM provider by instrumenting the chat method of the Ollama Python SDK.
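
The linked post covers the provider-side integration in detail. As a rough illustration of the idea only, the sketch below wraps an Ollama chat call with MLflow's fluent tracing decorator rather than patching the SDK itself; the model name and prompt are placeholder values, not taken from the post.

```python
# Minimal sketch: record an Ollama chat call as an MLflow trace span.
# This wraps the call from the outside with mlflow.trace; the post describes
# adding tracing support inside the SDK's chat method instead.
import mlflow
import ollama
from mlflow.entities import SpanType


@mlflow.trace(span_type=SpanType.CHAT_MODEL)
def traced_chat(model: str, messages: list[dict]):
    """Capture the chat inputs and outputs on an MLflow span."""
    return ollama.chat(model=model, messages=messages)


if __name__ == "__main__":
    # Placeholder model and prompt; requires a local Ollama server.
    response = traced_chat(
        model="llama3.2",
        messages=[{"role": "user", "content": "Why is the sky blue?"}],
    )
    print(response["message"]["content"])
```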