---
title: Deploy MLflow models to online endpoint (preview)
titleSuffix: Azure Machine Learning
description: Learn to deploy your MLflow model as a web service that's automatically managed by Azure.
services: machine-learning
ms.service: machine-learning
ms.subservice: core
ms.author: ssambare
author: shivanissambare
ms.date: 03/31/2022
ms.topic: how-to
ms.reviewer: larryfr
ms.custom: deploy, mlflow, devplatv2, no-code-deployment, devx-track-azurecli, cliv2, event-tier1-build-2022
ms.devlang: azurecli
---
[!INCLUDE cli v2]
[!div class="op_single_selector" title1="Select the version of Azure Machine Learning CLI extension you are using:"]
In this article, learn how to deploy your MLflow model to an online endpoint (preview) for real-time inference. When you deploy your MLflow model to an online endpoint, it's a no-code-deployment, so you don't have to provide a scoring script or an environment.
You only provide the typical MLflow model folder contents:
- MLmodel file
- conda.yaml
- model file(s)
For no-code-deployment, Azure Machine Learning:

- Dynamically installs Python packages provided in the `conda.yaml` file. This means the dependencies are installed during container runtime.
  - The base container image/curated environment used for dynamic installation is `mcr.microsoft.com/azureml/mlflow-ubuntu18.04-py37-cpu-inference` or `AzureML-mlflow-ubuntu18.04-py37-cpu-inference`.
- Provides an MLflow base image/curated environment that contains the following items:
  - `azureml-inference-server-http`
  - `mlflow-skinny`
  - `pandas`
  - The scoring script baked into the image.
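To make the model folder contents concrete, the following commands are a minimal sketch for inspecting the `sklearn-diabetes` sample model. They assume you've cloned the azureml-examples repository (covered in the prerequisites) and are working from its `cli` directory; file names and pinned versions in your own model will differ:

```azurecli
# Illustrative only: inspect an MLflow model folder from the examples repo.
# Assumes the azureml-examples repo is cloned and the working directory is its cli folder.
ls endpoints/online/mlflow/sklearn-diabetes/model
# Typical listing: the MLmodel file, conda.yaml, and the serialized model file(s).

# conda.yaml lists the Python packages that are installed dynamically at container runtime.
cat endpoints/online/mlflow/sklearn-diabetes/model/conda.yaml
```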
[!INCLUDE basic cli prereqs]
- You must have an MLflow model. The examples in this article are based on the models from https://github.com/Azure/azureml-examples/tree/main/cli/endpoints/online/mlflow.
- If you don't have an MLflow formatted model, you can convert your custom ML model to MLflow format.
[!INCLUDE clone repo & set defaults]
In the code snippets used in this article, the `ENDPOINT_NAME` environment variable contains the name of the endpoint to create and use. To set it, use the following command from the CLI. Replace `<YOUR_ENDPOINT_NAME>` with the name of your endpoint:
:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-managed-online-endpoint-mlflow.sh" ID="set_endpoint_name":::
[!INCLUDE cli v2]
This example shows how you can deploy an MLflow model to an online endpoint using CLI (v2).
> [!IMPORTANT]
> For MLflow no-code-deployment, testing via local endpoints is currently not supported.
- Create a YAML configuration file for your endpoint. The following example configures the name and authentication mode of the endpoint:
create-endpoint.yaml
:::code language="yaml" source="~/azureml-examples-main/cli/endpoints/online/mlflow/create-endpoint.yaml":::
- To create a new endpoint using the YAML configuration, use the following command:
:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-managed-online-endpoint-mlflow.sh" ID="create_endpoint":::
- Create a YAML configuration file for the deployment. The following example configures a deployment of the `sklearn-diabetes` model to the endpoint created in the previous step:

  > [!IMPORTANT]
  > For MLflow no-code-deployment (NCD) to work, setting `type` to `mlflow_model` (`type: mlflow_model`) is required. For more information, see CLI (v2) model YAML schema.

  sklearn-deployment.yaml
:::code language="yaml" source="~/azureml-examples-main/cli/endpoints/online/mlflow/sklearn-deployment.yaml":::
- To create the deployment using the YAML configuration, use the following command. (A consolidated sketch of the endpoint and deployment steps appears after this command.)
:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-managed-online-endpoint-mlflow.sh" ID="create_sklearn_deployment":::
Once your deployment completes, use the following command to make a scoring request to the deployed endpoint. The `sample-request-sklearn.json` file used in this command is located in the `/cli/endpoints/online/mlflow` directory of the azureml-examples repo:
:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-managed-online-endpoint-mlflow.sh" ID="test_sklearn_deployment":::
sample-request-sklearn.json
:::code language="json" source="~/azureml-examples-main/cli/endpoints/online/mlflow/sample-request-sklearn.json":::
The response will be similar to the following text:
```json
[
  11633.100167144921,
  8522.117402884991
]
```
Once you're done with the endpoint, use the following command to delete it:
:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-managed-online-endpoint-mlflow.sh" ID="delete_endpoint":::
This example shows how you can deploy an MLflow model to an online endpoint using Azure Machine Learning studio.
- Register your model in MLflow format using the following YAML and CLI command. The YAML uses a scikit-learn MLflow model from https://github.com/Azure/azureml-examples/tree/main/cli/endpoints/online/mlflow.

  sample-create-mlflow-model.yaml

  ```yaml
  $schema: https://azuremlschemas.azureedge.net/latest/model.schema.json
  name: sklearn-diabetes-mlflow
  version: 1
  path: sklearn-diabetes/model
  type: mlflow_model
  description: Scikit-learn MLflow model.
  ```

  ```azurecli
  az ml model create -f sample-create-mlflow-model.yaml
  ```
- From studio, select your workspace and then use either the endpoints or models page to create the endpoint deployment:
- From the Endpoints page, select + Create (preview).
:::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/create-from-endpoints.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/create-from-endpoints.png" alt-text="Screenshot showing create option on the Endpoints UI page.":::
- Provide a name and authentication type for the endpoint, and then select Next.
- When selecting a model, select the MLflow model registered previously. Select Next to continue.
- When you select a model registered in MLflow format, the Environment step of the wizard doesn't require a scoring script or an environment.
:::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/ncd-wizard.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/ncd-wizard.png" alt-text="Screenshot showing no code and environment needed for MLflow models":::
- Complete the wizard to deploy the model to the endpoint.
:::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/review-screen-ncd.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/review-screen-ncd.png" alt-text="Screenshot showing NCD review screen":::
- Alternatively, from the Models page, select the MLflow model, and then select Deploy. When prompted, select Deploy to real-time endpoint (preview).
:::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/deploy-from-models-ui.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/deploy-from-models-ui.png" alt-text="Screenshot showing how to deploy model from Models UI":::
- Complete the wizard to deploy the model to the endpoint.
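If you also have the CLI configured against the same workspace, you can optionally confirm what the wizard created; the endpoint name below is a placeholder for whatever you entered in the wizard:

```azurecli
# Optional check from the CLI: show the endpoint and list its deployments.
az ml online-endpoint show --name <YOUR_ENDPOINT_NAME>
az ml online-deployment list --endpoint-name <YOUR_ENDPOINT_NAME> --output table
```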
This section helps you understand how to deploy models to an online endpoint once you have completed your training job.
- Download the outputs from the training job. The outputs contain the model folder.

  > [!NOTE]
  > If you have used `mlflow.autolog()` in your training script, you will see model artifacts in the job's run history. Azure Machine Learning integrates with MLflow's tracking functionality. You can use `mlflow.autolog()` for several common ML frameworks to log model parameters, performance metrics, model artifacts, and even feature importance graphs.
  >
  > For more information, see Train models with CLI. Also see the training job samples in the GitHub repository.
:::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/download-output-logs.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/download-output-logs.png" alt-text="Screenshot showing how to download Outputs and logs from Experimentation run":::
```azurecli
az ml job download -n $run_id --outputs
```
- To deploy using the downloaded files, you can use either studio or the Azure command-line interface. Use the model folder from the outputs for deployment; one possible CLI sequence is sketched below.
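As one hedged example of the CLI route, you can register the downloaded model folder as an MLflow model and then reuse the deployment steps from earlier in this article. The model name and local path below are hypothetical and depend on where `az ml job download` placed the outputs:

```azurecli
# Register the downloaded model folder as an MLflow model
# (name and path are hypothetical; adjust to your downloaded outputs).
az ml model create --name my-trained-mlflow-model --version 1 \
  --path ./model --type mlflow_model

# The registered model can then be referenced from a deployment YAML
# (type: mlflow_model) and deployed with az ml online-deployment create,
# as shown earlier in this article.
```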
To learn more, review these articles:
- Deploy models with REST (preview)
- Create and use online endpoints (preview) in the studio
- Safe rollout for online endpoints (preview)
- How to autoscale managed online endpoints
- Use batch endpoints (preview) for batch scoring
- View costs for an Azure Machine Learning managed online endpoint (preview)
- Access Azure resources with an online endpoint and managed identity (preview)
- Troubleshoot online endpoint deployment