---
title: Deploy models using online endpoints with REST APIs
titleSuffix: Azure Machine Learning
description: Learn how to deploy models using online endpoints with REST APIs.
services: machine-learning
ms.service: machine-learning
ms.subservice: core
ms.topic: how-to
author: rsethur
ms.author: seramasu
ms.date: 12/22/2021
ms.reviewer: laobri
ms.custom: devplatv2, event-tier1-build-2022
---
Learn how to use the Azure Machine Learning REST API to deploy models.
The REST API uses standard HTTP verbs to create, retrieve, update, and delete resources. The REST API works with any language or tool that can make HTTP requests. REST's straightforward structure makes it a good choice in scripting environments and for MLOps automation.
In this article, you learn how to use the new REST APIs to:

> [!div class="checklist"]
> - Register the assets needed for deployment (code, model, and environment)
> - Create an online endpoint and a deployment
> - Invoke the endpoint to score data
- An Azure subscription for which you have administrative rights. If you don't have such a subscription, try the free or paid personal subscription.
- An Azure Machine Learning workspace.
- A service principal in your workspace. Administrative REST requests use service principal authentication.
- A service principal authentication token. Follow the steps in Retrieve a service principal authentication token to retrieve this token.
- The curl utility. The curl program is available in the Windows Subsystem for Linux or any UNIX distribution. In PowerShell, curl is an alias for Invoke-WebRequest and `curl -d "key=val" -X POST uri` becomes `Invoke-WebRequest -Body "key=val" -Method POST -Uri uri`.
> [!NOTE]
> Endpoint names need to be unique at the Azure region level. For example, there can be only one endpoint with the name `my-endpoint` in `westus2`.
:::code language="rest-api" source="~/azureml-examples-main/cli/deploy-rest.sh" id="set_endpoint_name":::
Online endpoints allow you to deploy your model without having to create and manage the underlying infrastructure or Kubernetes clusters. In this article, you'll create an online endpoint and deployment, and validate it by invoking it. But first you'll have to register the assets needed for deployment, including the model, code, and environment.

There are many ways to create an Azure Machine Learning online endpoint, including the Azure CLI and, visually, the studio. The following example creates an online endpoint and deployment with the REST API.
First, set up your Azure Machine Learning assets to configure your job.
In the following REST API calls, we use `SUBSCRIPTION_ID`, `RESOURCE_GROUP`, `LOCATION`, and `WORKSPACE` as placeholders. Replace the placeholders with your own values.

Administrative REST requests use a service principal authentication token. Replace `TOKEN` with your own value. You can retrieve this token with the following command:
:::code language="rest-api" source="~/azureml-examples-main/cli/deploy-rest.sh" id="get_access_token":::
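If you're experimenting interactively, one simple way to get a bearer token for Azure Resource Manager is the Azure CLI. This is a sketch that assumes you're already signed in (as the service principal or another identity with sufficient rights); it isn't the exact command from the sample script:

```bash
# Request an Azure Resource Manager access token for the signed-in identity.
TOKEN=$(az account get-access-token --query accessToken --output tsv)
```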
The service provider uses the `api-version` argument to ensure compatibility. The `api-version` argument varies from service to service. Set the API version as a variable to accommodate future versions:
:::code language="rest-api" source="~/azureml-examples-main/cli/deploy-rest.sh" id="api_version":::
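Concretely, this is just a shell variable. The version string below is illustrative only; substitute the API version you're targeting:

```bash
# Illustrative value: use the API version documented for the resource types you call.
API_VERSION="2022-05-01"
```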
To register the model and code, first they need to be uploaded to a storage account. The details of the storage account are available in the datastore. In this example, you get the default datastore and Azure Storage account for your workspace. Query your workspace with a GET request to return JSON that contains the information.

You can use the tool `jq` to parse the JSON result and get the required values. You can also use the Azure portal to find the same information:
:::code language="rest-api" source="~/azureml-examples-main/cli/deploy-rest.sh" id="get_storage_details":::
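One way the query might look is sketched below: list the workspace datastores filtered to the default one, then parse the storage account and container names out of the response with `jq`. The `isDefault` filter and the JSON property paths are assumptions that can vary by API version, so inspect the raw response if anything comes back empty:

```bash
# List the workspace datastores and keep only the default one.
response=$(curl --location --request GET \
  "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/datastores?api-version=$API_VERSION&isDefault=true" \
  --header "Authorization: Bearer $TOKEN")

# Property paths below are assumptions; adjust them to match the response you get.
AZURE_STORAGE_ACCOUNT=$(echo "$response" | jq -r '.value[0].properties.accountName')
AZUREML_DEFAULT_CONTAINER=$(echo "$response" | jq -r '.value[0].properties.containerName')
AZUREML_DEFAULT_DATASTORE=$(echo "$response" | jq -r '.value[0].name')
```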
Now that you have the datastore, you can upload the scoring script. Use the Azure Storage CLI to upload a blob into your default container:
:::code language="rest-api" source="~/azureml-examples-main/cli/deploy-rest.sh" id="upload_code":::
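A minimal sketch using the Azure CLI storage commands is shown below; the local folder and the `score` destination path are placeholders, so point them at wherever your scoring script actually lives:

```bash
# Upload the folder containing score.py to the workspace's default blob container.
az storage blob upload-batch \
  --account-name "$AZURE_STORAGE_ACCOUNT" \
  --destination "$AZUREML_DEFAULT_CONTAINER" \
  --destination-path score \
  --source ./onlinescoring
```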
Tip
You can also use other methods to upload, such as the Azure portal or Azure Storage Explorer.
Once you upload your code, you can specify your code with a PUT request and refer to the datastore with `datastoreId`:
:::code language="rest-api" source="~/azureml-examples-main/cli/deploy-rest.sh" id="create_code":::
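Roughly, the call looks like the sketch below. The asset name, version, and `codeUri` value are illustrative, and depending on the API version the body may reference the datastore through `datastoreId` or through a full blob URI, so treat the payload as an approximation of the referenced script rather than a copy of it:

```bash
# Register a code asset that points at the uploaded scoring folder.
curl --location --request PUT \
  "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/codes/score-sklearn/versions/1?api-version=$API_VERSION" \
  --header "Authorization: Bearer $TOKEN" \
  --header "Content-Type: application/json" \
  --data-raw "{
    \"properties\": {
      \"description\": \"Scoring code\",
      \"codeUri\": \"https://$AZURE_STORAGE_ACCOUNT.blob.core.windows.net/$AZUREML_DEFAULT_CONTAINER/score\"
    }
  }"
```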
Similar to the code, upload the model files:
:::code language="rest-api" source="~/azureml-examples-main/cli/deploy-rest.sh" id="upload_model":::
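The upload follows the same pattern as the code upload; the local path and the `model` destination folder are placeholders:

```bash
# Upload the serialized model file(s) to the default container.
az storage blob upload-batch \
  --account-name "$AZURE_STORAGE_ACCOUNT" \
  --destination "$AZUREML_DEFAULT_CONTAINER" \
  --destination-path model \
  --source ./model
```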
Now, register the model:
:::code language="rest-api" source="~/azureml-examples-main/cli/deploy-rest.sh" id="create_model":::
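A hedged sketch of the registration call follows. The model name, version, and especially the `modelUri` format are assumptions here, so verify them against the models REST reference for your API version:

```bash
# Register a model version that references the uploaded artifact in the default datastore.
curl --location --request PUT \
  "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/models/sklearn-model/versions/1?api-version=$API_VERSION" \
  --header "Authorization: Bearer $TOKEN" \
  --header "Content-Type: application/json" \
  --data-raw "{
    \"properties\": {
      \"modelUri\": \"azureml://subscriptions/$SUBSCRIPTION_ID/resourcegroups/$RESOURCE_GROUP/workspaces/$WORKSPACE/datastores/$AZUREML_DEFAULT_DATASTORE/paths/model\"
    }
  }"
```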
The deployment needs to run in an environment that has the required dependencies. Create the environment with a PUT request. Use a Docker image from Microsoft Container Registry. You can configure the Docker image with `Docker` and add conda dependencies with `condaFile`.

In the following snippet, the contents of a Conda environment (YAML file) have been read into an environment variable:
:::code language="rest-api" source="~/azureml-examples-main/cli/deploy-rest.sh" id="create_environment":::
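Sketched out, the request pairs a base Docker image with the conda specification. The image tag, the `image` and `condaFile` property names, and the `$CONDA_FILE` variable (assumed to hold the YAML contents read in above) are all illustrative:

```bash
# Create an environment version from a base image plus the conda YAML held in $CONDA_FILE.
curl --location --request PUT \
  "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/environments/sklearn-env/versions/1?api-version=$API_VERSION" \
  --header "Authorization: Bearer $TOKEN" \
  --header "Content-Type: application/json" \
  --data-raw "{
    \"properties\": {
      \"image\": \"mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04\",
      \"condaFile\": $(echo "$CONDA_FILE" | jq -Rs .)
    }
  }"
```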
Create the online endpoint:
:::code language="rest-api" source="~/azureml-examples-main/cli/deploy-rest.sh" id="create_endpoint":::
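At its core, the endpoint is a small ARM resource: a location, an identity, and an auth mode. The sketch below uses `AMLToken` auth to match the token-based invocation later in this article; treat the exact property names as assumptions to check against the REST reference:

```bash
# Create (or update) the managed online endpoint with token-based auth.
curl --location --request PUT \
  "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/onlineEndpoints/$ENDPOINT_NAME?api-version=$API_VERSION" \
  --header "Authorization: Bearer $TOKEN" \
  --header "Content-Type: application/json" \
  --data-raw "{
    \"location\": \"$LOCATION\",
    \"identity\": { \"type\": \"SystemAssigned\" },
    \"properties\": { \"authMode\": \"AMLToken\" }
  }"
```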
Create a deployment under the endpoint:
:::code language="rest-api" source="~/azureml-examples-main/cli/deploy-rest.sh" id="create_deployment":::
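The deployment body is where the registered model, code, and environment come together, along with the compute size. This is the part of the schema that changes most between API versions, so the property names below (`endpointComputeType`, `codeConfiguration`, `instanceType`, the `sku` block) and the asset names are best-effort assumptions; the referenced script is authoritative:

```bash
# Create a deployment named "blue" under the endpoint, wiring up model, code, and environment.
curl --location --request PUT \
  "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/onlineEndpoints/$ENDPOINT_NAME/deployments/blue?api-version=$API_VERSION" \
  --header "Authorization: Bearer $TOKEN" \
  --header "Content-Type: application/json" \
  --data-raw "{
    \"location\": \"$LOCATION\",
    \"sku\": { \"name\": \"Default\", \"capacity\": 1 },
    \"properties\": {
      \"endpointComputeType\": \"Managed\",
      \"instanceType\": \"Standard_DS3_v2\",
      \"model\": \"/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/models/sklearn-model/versions/1\",
      \"codeConfiguration\": {
        \"codeId\": \"/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/codes/score-sklearn/versions/1\",
        \"scoringScript\": \"score.py\"
      },
      \"environmentId\": \"/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/environments/sklearn-env/versions/1\"
    }
  }"
```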
To invoke the endpoint, you need the scoring URI and an access token. First, get the scoring URI:
:::code language="rest-api" source="~/azureml-examples-main/cli/deploy-rest.sh" id="get_endpoint":::
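A sketch of reading the endpoint back and extracting the scoring URI with `jq` (the `scoringUri` property name is believed correct, but confirm it in your response):

```bash
# Fetch the endpoint and pull out its scoring URI.
SCORING_URI=$(curl --location --request GET \
  "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/onlineEndpoints/$ENDPOINT_NAME?api-version=$API_VERSION" \
  --header "Authorization: Bearer $TOKEN" | jq -r '.properties.scoringUri')
```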
Get the endpoint access token:
:::code language="rest-api" source="~/azureml-examples-main/cli/deploy-rest.sh" id="get_access_token":::
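Because the endpoint in this sketch uses `AMLToken` auth, scoring calls need an endpoint-scoped token rather than the ARM token. One way to request it is through the endpoint's `token` action; the action name and the `accessToken` field are assumptions to verify:

```bash
# Request a scoring token for the endpoint.
ENDPOINT_TOKEN=$(curl --location --request POST \
  "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/onlineEndpoints/$ENDPOINT_NAME/token?api-version=$API_VERSION" \
  --header "Authorization: Bearer $TOKEN" | jq -r '.accessToken')
```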
Now, invoke the endpoint using curl:
:::code language="rest-api" source="~/azureml-examples-main/cli/deploy-rest.sh" id="score_endpoint":::
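Scoring itself is an ordinary POST to the scoring URI with that token; `sample-request.json` is a placeholder for whatever JSON your scoring script expects:

```bash
# Send a scoring request to the deployed model.
curl --location --request POST "$SCORING_URI" \
  --header "Authorization: Bearer $ENDPOINT_TOKEN" \
  --header "Content-Type: application/json" \
  --data @sample-request.json
```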
Check the deployment logs:
:::code language="rest-api" source="~/azureml-examples-main/cli/deploy-rest.sh" id="get_deployment_logs":::
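Logs come back from a POST to the deployment's `getLogs` action; the `tail` parameter (number of log lines) and the action name are assumptions to confirm against the REST reference:

```bash
# Fetch the last 100 log lines from the "blue" deployment.
curl --location --request POST \
  "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/onlineEndpoints/$ENDPOINT_NAME/deployments/blue/getLogs?api-version=$API_VERSION" \
  --header "Authorization: Bearer $TOKEN" \
  --header "Content-Type: application/json" \
  --data-raw '{ "tail": 100 }'
```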
If you aren't going to use the deployment, you should delete it with the following command (it deletes the endpoint and all the underlying deployments):
:::code language="rest-api" source="~/azureml-examples-main/cli/deploy-rest.sh" id="delete_endpoint":::
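In sketch form, cleanup is a single DELETE on the endpoint resource, which also removes the deployments under it:

```bash
# Delete the endpoint and all deployments under it.
curl --location --request DELETE \
  "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/onlineEndpoints/$ENDPOINT_NAME?api-version=$API_VERSION" \
  --header "Authorization: Bearer $TOKEN"
```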
- Learn how to deploy your model using the Azure CLI.
- Learn how to deploy your model using the studio.
- Learn how to troubleshoot online endpoint deployment and scoring.
- Learn how to access Azure resources with an online endpoint and managed identity.
- Learn how to monitor online endpoints.
- Learn about safe rollout for online endpoints.
- View costs for an Azure Machine Learning managed online endpoint.
- Managed online endpoints SKU list.
- Learn about limits on managed online endpoints in Manage and increase quotas for resources with Azure Machine Learning.