Update MLflow Databricks doc and fix broke image refs (#17702)
B-Step62 authored Feb 3, 2025
1 parent 8a3feea commit aa44b4e
Showing 16 changed files with 190 additions and 313 deletions.
Binary file removed docs/docs/_static/integrations/mlflow.gif
Binary file added docs/docs/_static/integrations/mlflow/mlflow.gif
4 changes: 4 additions & 0 deletions docs/docs/community/integrations.md
@@ -25,6 +25,10 @@ We support [a huge number of LLMs](../module_guides/models/llms/modules.md).
Check out our [one-click observability](../module_guides/observability/index.md) page
for full tracing integrations.

## Experiment Tracking

- [MLflow](../../examples/observability/mlflow)

## Structured Outputs

- [Guidance](integrations/guidance.md)
6 changes: 3 additions & 3 deletions docs/docs/examples/llm/databricks.ipynb
@@ -85,8 +85,8 @@
"source": [
"\n",
"```bash\n",
-"export DATABRICKS_API_KEY=<your api key>\n",
-"export DATABRICKS_API_BASE=<your api serving endpoint>\n",
+"export DATABRICKS_TOKEN=<your api key>\n",
+"export DATABRICKS_SERVING_ENDPOINT=<your api serving endpoint>\n",
"```\n",
"\n",
"Alternatively, you can pass your API key and serving endpoint to the LLM when you init it:"
@@ -102,7 +102,7 @@
"llm = Databricks(\n",
" model=\"databricks-dbrx-instruct\",\n",
" api_key=\"your_api_key\",\n",
-" api_base=\"https://[your-work-space].cloud.databricks.com/serving-endpoints/[your-serving-endpoint]\",\n",
+" api_base=\"https://[your-work-space].cloud.databricks.com/serving-endpoints/\",\n",
")"
]
},
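The renamed variables (`DATABRICKS_TOKEN`, `DATABRICKS_SERVING_ENDPOINT`) and the shortened `api_base` (which now ends at `serving-endpoints/`, with the endpoint selected by the `model` argument) can be wired together as below — a minimal sketch with hypothetical placeholder values, not part of the notebook itself:

```python
import os

# Hypothetical values for illustration only; real deployments export these
# in the shell, as the notebook diff above shows.
os.environ.setdefault("DATABRICKS_TOKEN", "dapi-example-token")
os.environ.setdefault(
    "DATABRICKS_SERVING_ENDPOINT",
    "https://my-workspace.cloud.databricks.com/serving-endpoints/",
)

api_key = os.environ["DATABRICKS_TOKEN"]
api_base = os.environ["DATABRICKS_SERVING_ENDPOINT"]

# The LLM would then be constructed as in the notebook:
# llm = Databricks(model="databricks-dbrx-instruct", api_key=api_key, api_base=api_base)
print(api_base)
```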
389 changes: 152 additions & 237 deletions docs/docs/examples/observability/MLflow.ipynb

Large diffs are not rendered by default.

Binary file removed docs/docs/examples/observability/mlflow_ui_run.png
40 changes: 0 additions & 40 deletions docs/docs/examples/workflow/function_calling_agent.ipynb
@@ -42,46 +42,6 @@
"Set up tracing to visualize each step in the workflow."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!pip install \"llama-index-core>=0.10.43\" \"openinference-instrumentation-llama-index>=2.2.2\" \"opentelemetry-proto>=1.12.0\" opentelemetry-exporter-otlp opentelemetry-sdk"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"\n",
"from opentelemetry.sdk import trace as trace_sdk\n",
"from opentelemetry.sdk.trace.export import SimpleSpanProcessor\n",
"from opentelemetry.exporter.otlp.proto.http.trace_exporter import (\n",
" OTLPSpanExporter as HTTPSpanExporter,\n",
")\n",
"from openinference.instrumentation.llama_index import LlamaIndexInstrumentor\n",
"\n",
"\n",
"# Add Phoenix API Key for tracing\n",
"PHOENIX_API_KEY = \"<YOUR-PHOENIX-API-KEY>\"\n",
"os.environ[\"OTEL_EXPORTER_OTLP_HEADERS\"] = f\"api_key={PHOENIX_API_KEY}\"\n",
"\n",
"# Add Phoenix\n",
"span_phoenix_processor = SimpleSpanProcessor(\n",
" HTTPSpanExporter(endpoint=\"https://app.phoenix.arize.com/v1/traces\")\n",
")\n",
"\n",
"# Add them to the tracer\n",
"tracer_provider = trace_sdk.TracerProvider()\n",
"tracer_provider.add_span_processor(span_processor=span_phoenix_processor)\n",
"\n",
"# Instrument the application\n",
"LlamaIndexInstrumentor().instrument(tracer_provider=tracer_provider)"
]
},
{
"cell_type": "markdown",
"metadata": {},
64 changes: 31 additions & 33 deletions docs/docs/module_guides/observability/index.md
@@ -79,6 +79,37 @@ llama_index.core.set_global_handler(

![](../../_static/integrations/arize_phoenix.png)


### MLflow

[MLflow](https://mlflow.org/docs/latest/llms/tracing/index.html) is an open-source MLOps/LLMOps platform that focuses on the full lifecycle of machine learning projects, ensuring that each phase is manageable, traceable, and reproducible.
**MLflow Tracing** is an OpenTelemetry-based tracing capability that provides one-click instrumentation for LlamaIndex applications.

#### Usage Pattern

Since MLflow is open-source, you can start using it without creating an account or setting up an API key — jump straight into the code after installing the MLflow package!

```python
import mlflow

mlflow.llama_index.autolog() # Enable mlflow tracing
```

![](../../_static/integrations/mlflow/mlflow.gif)
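In shared example code where MLflow is an optional dependency, the one-line `autolog()` call can be guarded so the snippet degrades gracefully when the package is absent — a sketch, not part of the official integration:

```python
import importlib.util

# Enable MLflow tracing only when both mlflow and llama-index are importable.
if (
    importlib.util.find_spec("mlflow") is not None
    and importlib.util.find_spec("llama_index") is not None
):
    import mlflow

    mlflow.llama_index.autolog()  # the one-line switch shown above
    tracing_enabled = True
else:
    tracing_enabled = False

print(f"MLflow tracing enabled: {tracing_enabled}")
```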

#### Guides

The MLflow LlamaIndex integration also provides experiment tracking, evaluation, dependency management, and more. Check out the [MLflow documentation](https://mlflow.org/docs/latest/llms/llama-index/index.html) for more details.

#### Support Table

MLflow Tracing supports the full range of LlamaIndex features. Newer features such as [AgentWorkflow](https://www.llamaindex.ai/blog/introducing-agentworkflow-a-powerful-system-for-building-ai-agent-systems) require MLflow >= 2.18.0.

| Streaming | Async | Engine | Agents | Workflow | AgentWorkflow |
| --- | --- | --- | --- | --- | --- |
| ✅ | ✅ | ✅ | ✅ | ✅ (>= 2.18) | ✅ (>= 2.18) |
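Since AgentWorkflow tracing needs MLflow >= 2.18.0, example code can gate on the installed version before relying on it. A sketch using the standard library's `importlib.metadata` (the 2.18 floor comes from the table above; the helper name is illustrative):

```python
from importlib.metadata import PackageNotFoundError, version


def mlflow_supports_agentworkflow() -> bool:
    """True if the installed MLflow meets the >= 2.18.0 floor for AgentWorkflow tracing."""
    try:
        raw = version("mlflow")
    except PackageNotFoundError:
        # MLflow is not installed at all.
        return False
    parts = []
    for piece in raw.split("."):
        if piece.isdigit():
            parts.append(int(piece))
        else:
            break  # stop at pre-release suffixes like "rc1"
    return tuple(parts) >= (2, 18)


print(mlflow_supports_agentworkflow())
```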


### OpenLLMetry

[OpenLLMetry](https://github.com/traceloop/openllmetry) is an open-source project based on OpenTelemetry for tracing and monitoring
@@ -546,39 +577,6 @@ import llama_index.core
llama_index.core.set_global_handler("simple")
```

### MLflow

[MLflow](https://mlflow.org/docs/latest/index.html) is an open-source platform, purpose-built to assist machine learning practitioners and teams in handling the complexities of the machine learning process. MLflow focuses on the full lifecycle for machine learning projects, ensuring that each phase is manageable, traceable, and reproducible.

#### Install

```shell
pip install "mlflow>=2.15" "llama-index>=0.10.44"
```

#### Usage Pattern

```python
import mlflow

mlflow.llama_index.autolog()  # Enable mlflow tracing

with mlflow.start_run() as run:
    mlflow.llama_index.log_model(
        index,
        artifact_path="llama_index",
        engine_type="query",  # Logged engine type for inference
        input_example="hi",
        registered_model_name="my_llama_index_vector_store",
    )
    model_uri = f"runs:/{run.info.run_id}/llama_index"

predictions = mlflow.pyfunc.load_model(model_uri).predict("hi")
print(f"Query engine prediction: {predictions}")
```

![](../../_static/integrations/mlflow.gif)

#### Guides

- [MLflow](https://mlflow.org/docs/latest/llms/llama-index/index.html)
