---
title: Update deployed web services
titleSuffix: Azure Machine Learning
description: Learn how to refresh a web service that is already deployed in Azure Machine Learning. You can update settings such as model, environment, and entry script.
ms.service: machine-learning
ms.subservice: mlops
ms.topic: how-to
ms.author: larryfr
author: blackmist
ms.date: 10/21/2021
ms.custom: deploy, cliv1, sdkv1, event-tier1-build-2022
---
[!INCLUDE dev v1]
In this article, you learn how to update a web service that was deployed with Azure Machine Learning.

## Prerequisites

- This article assumes you have already deployed a web service with Azure Machine Learning. If you need to learn how to deploy a web service, follow these steps.
- The code snippets in this article assume that the `ws` variable has already been initialized to your workspace by using the `Workspace()` constructor or by loading a saved configuration with `Workspace.from_config()`. The following snippet demonstrates how to use the constructor:

[!INCLUDE sdk v1]
```python
from azureml.core import Workspace

ws = Workspace(subscription_id="mysubscriptionid",
               resource_group="myresourcegroup",
               workspace_name="myworkspace")
```
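As an alternative to hard-coding the identifiers, `Workspace.from_config()` reads them from a `config.json` file, such as the one you can download from the Azure portal. The sketch below is illustrative only and is not part of the SDK: the hypothetical `load_workspace_config` helper stands in for the SDK call and shows the three-key layout such a file contains.

```python
import json
import tempfile
from pathlib import Path

def load_workspace_config(path):
    """Illustrative helper: read the three identifiers that
    Workspace.from_config() expects from a config.json file."""
    with open(path) as f:
        cfg = json.load(f)
    # These three keys identify the workspace.
    return {key: cfg[key] for key in
            ("subscription_id", "resource_group", "workspace_name")}

# Demonstration with a temporary config.json.
with tempfile.TemporaryDirectory() as tmp:
    cfg_path = Path(tmp) / "config.json"
    cfg_path.write_text(json.dumps({
        "subscription_id": "mysubscriptionid",
        "resource_group": "myresourcegroup",
        "workspace_name": "myworkspace",
    }))
    kwargs = load_workspace_config(cfg_path)
    print(kwargs["workspace_name"])  # → myworkspace
```

In real code you would simply call `Workspace.from_config()` and let the SDK locate the file.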
To update a web service, use the `update` method. You can update the web service to use a new model, a new entry script, or new dependencies that can be specified in an inference configuration. For more information, see the documentation for Webservice.update.

- See the AKS Service Update Method.
- See the ACI Service Update Method.
> [!IMPORTANT]
> When you create a new version of a model, you must manually update each service that you want to use the new model.
>
> You can't use the SDK to update a web service published from the Azure Machine Learning designer.
> [!IMPORTANT]
> Azure Kubernetes Service uses the Blobfuse FlexVolume driver for versions 1.16 and earlier, and the Blob CSI driver for versions 1.17 and later.
>
> Therefore, it's important to redeploy or update the web service after a cluster upgrade so that the deployment uses the correct blob mount method for the cluster version.
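The version boundary above can be checked mechanically before you decide to redeploy. This is a minimal sketch in plain Python (no Azure calls; `uses_csi_driver` is a hypothetical helper name, not an SDK function) that applies the 1.16-or-earlier versus 1.17-or-later rule to a Kubernetes version string:

```python
def uses_csi_driver(kubernetes_version: str) -> bool:
    """Return True when an AKS cluster at this Kubernetes version uses
    the Blob CSI driver (1.17+); False means Blobfuse FlexVolume (<=1.16)."""
    major, minor = (int(part) for part in kubernetes_version.split(".")[:2])
    return (major, minor) >= (1, 17)

print(uses_csi_driver("1.16.9"))   # → False (Blobfuse FlexVolume)
print(uses_csi_driver("1.17.3"))   # → True (Blob CSI driver)
```

If an upgrade moves the cluster across this boundary, update or redeploy the affected web services.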
> [!NOTE]
> When an operation is already in progress, any new operation on the same web service fails with a 409 conflict error. For example, if a create or update operation for a web service is in progress and you trigger a new delete operation, the delete operation fails with an error.
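One common way for client code to cope with this is to retry the new operation after a short delay until the in-progress operation finishes. The sketch below is a generic illustration, not SDK behavior: `ConflictError` and `retry_on_conflict` are hypothetical names standing in for an HTTP 409 response and a retry wrapper.

```python
import time

class ConflictError(Exception):
    """Illustrative stand-in for an HTTP 409 conflict response."""

def retry_on_conflict(operation, attempts=5, delay=0.01):
    """Retry an operation that raises ConflictError while another
    operation on the same web service is still in progress."""
    for attempt in range(attempts):
        try:
            return operation()
        except ConflictError:
            if attempt == attempts - 1:
                raise  # give up after the last attempt
            time.sleep(delay)

# Simulate a service that is busy for the first two calls.
calls = {"count": 0}
def delete_service():
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConflictError("409: update still in progress")
    return "deleted"

result = retry_on_conflict(delete_service)
print(result)  # → deleted
```

In production code you would also cap the total wait time and log each conflict rather than retrying silently.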
## Using the SDK
The following code shows how to use the SDK to update the model, environment, and entry script for a web service:
[!INCLUDE sdk v1]
```python
from azureml.core import Environment
from azureml.core.webservice import Webservice
from azureml.core.model import Model, InferenceConfig

# Register the new model.
new_model = Model.register(model_path="outputs/sklearn_mnist_model.pkl",
                           model_name="sklearn_mnist",
                           tags={"key": "0.1"},
                           description="test",
                           workspace=ws)

# Use version 3 of the environment.
deploy_env = Environment.get(workspace=ws, name="myenv", version="3")

inference_config = InferenceConfig(entry_script="score.py",
                                   environment=deploy_env)

service_name = 'myservice'
# Retrieve the existing service.
service = Webservice(name=service_name, workspace=ws)

# Update to the new model(s).
service.update(models=[new_model], inference_config=inference_config)
service.wait_for_deployment(show_output=True)
print(service.state)
print(service.get_logs())
```
## Using the CLI
You can also update a web service by using the ML CLI. The following example demonstrates registering a new model and then updating a web service to use the new model:
[!INCLUDE cli v1]
```azurecli
az ml model register -n sklearn_mnist --asset-path outputs/sklearn_mnist_model.pkl --experiment-name myexperiment --output-metadata-file modelinfo.json
az ml service update -n myservice --model-metadata-file modelinfo.json
```
> [!TIP]
> In this example, a JSON document is used to pass the model information from the registration command into the update command.
>
> To update the service to use a new entry script or environment, create an inference configuration file and specify it with the `--ic` parameter.
For more information, see the az ml service update documentation.
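As a rough illustration, an inference configuration file for the CLI is a JSON document along the following lines. The field names follow the CLI v1 `inferenceconfig.json` schema as best understood here, and the file names are placeholders; check the schema reference for your CLI version before relying on them:

```json
{
    "entryScript": "score.py",
    "runtime": "python",
    "condaFile": "myenv.yml"
}
```

You would then pass this file to the update command with the `--ic` parameter mentioned above.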
## Next steps

- Troubleshoot a failed deployment
- Create client applications to consume web services
- How to deploy a model using a custom Docker image
- Use TLS to secure a web service through Azure Machine Learning
- Monitor your Azure Machine Learning models with Application Insights
- Collect data for models in production
- Create event alerts and triggers for model deployments