
MLflow 2.14.2

MLflow maintainers · 2 min read

MLflow 2.14.2 is a patch release that includes several important bug fixes and documentation enhancements.

Bug fixes:

  • [Models] Fix an issue with requirements inference error handling when disabling the default warning-only behavior (#12547, @B-Step62)
  • [Models] Fix dependency inference issues with Transformers models saved with the unified llm/v1/xxx task definitions (#12551, @B-Step62)
  • [Models / Databricks] Fix an issue with MLflow log_model introduced in MLflow 2.13.0 that causes the Databricks DLT service to crash in some situations (#12514, @WeichenXu123)
  • [Models] Fix an output data structure issue with the predict_stream implementation for LangChain AgentExecutor and other non-Runnable chains (#12518, @B-Step62)
  • [Tracking] Fix an issue with the predict_proba inference method in the sklearn flavor when loading an sklearn pipeline object as pyfunc (see the sketch after this list) (#12554, @WeichenXu123)
  • [Tracking] Fix an issue with the Tracing implementation where other services' usage of OpenTelemetry would activate MLflow tracing and cause errors (#12457, @B-Step62)
  • [Tracking / Databricks] Correct an issue when running dependency inference in Databricks that can cause duplicate dependency entries to be logged (#12493, @sunishsheth2009)
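
As a minimal sketch of the workflow touched by the predict_proba fix above (the pipeline, data, and run setup are illustrative assumptions, not the exact scenario from the fix): an sklearn pipeline is logged with the pyfunc prediction function set to predict_proba and then loaded back as a generic pyfunc model.

```python
# Sketch only: log an sklearn pipeline so pyfunc predictions route through
# predict_proba, then load it back via the pyfunc flavor.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
pipeline = Pipeline(
    [("scale", StandardScaler()), ("clf", LogisticRegression(max_iter=200))]
)
pipeline.fit(X, y)

with mlflow.start_run():
    # pyfunc_predict_fn tells the pyfunc wrapper which sklearn method to call
    info = mlflow.sklearn.log_model(
        pipeline, "model", pyfunc_predict_fn="predict_proba"
    )

# Load the logged pipeline back as a generic pyfunc model
loaded = mlflow.pyfunc.load_model(info.model_uri)
probabilities = loaded.predict(X[:5])  # class probabilities, not labels
print(probabilities)
```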

Documentation updates:

  • [Docs] Add documentation and guides for the MLflow tracing schema (#12521, @BenWilson2)

For a comprehensive list of changes, see the release change log, and check out the latest documentation on mlflow.org.