
Commit b280d26

committed Dec 18, 2020
update edge articles
1 parent f0b4d1f commit b280d26

File tree

4 files changed: +50 -149 lines changed


‎articles/stream-analytics/TOC.yml

Lines changed: 5 additions & 2 deletions
````diff
@@ -40,6 +40,9 @@
   href: stream-analytics-edge-csharp-udf.md
 - name: 5 - Create custom .NET deserializer
   href: custom-deserializer.md
+- name: Create an IoT Edge job
+  href: /azure/iot-edge/tutorial-deploy-stream-analytics?view=iotedge-2018-06
+  maintainContext: true
 - name: Samples
   items:
   - name: Code samples
@@ -129,6 +132,8 @@
   href: repartition.md
 - name: Increase throughput of your job
   href: stream-analytics-scale-jobs.md
+- name: IoT Edge
+  href: stream-analytics-edge.md
 - name: States of a job
   href: job-states.md
 - name: Window functions
@@ -208,8 +213,6 @@
   href: stream-analytics-twitter-sentiment-analysis-trends.md
 - name: Real-time fraud detection
   href: stream-analytics-real-time-fraud-detection.md
-- name: Run jobs on IoT Edge
-  href: stream-analytics-edge.md
 - name: Run jobs on Azure Stack
   href: on-azure-stack.md
 - name: Toll booth sensor data analysis
````

‎articles/stream-analytics/stream-analytics-edge.md

Lines changed: 32 additions & 146 deletions
````diff
@@ -5,143 +5,54 @@ ms.service: stream-analytics
 author: mamccrea
 ms.author: mamccrea
 ms.reviewer: mamccrea
-ms.topic: how-to
+ms.topic: conceptual
 ms.date: 10/29/2020
-ms.custom: seodec18
+ms.custom: contperf-fy21q2
 ---
 
 # Azure Stream Analytics on IoT Edge
 
-Azure Stream Analytics (ASA) on IoT Edge empowers developers to deploy near-real-time analytical intelligence closer to IoT devices so that they can unlock the full value of device-generated data. Azure Stream Analytics is designed for low latency, resiliency, efficient use of bandwidth, and compliance. Enterprises can now deploy control logic close to the industrial operations and complement Big Data analytics done in the cloud.
+Azure Stream Analytics on IoT Edge empowers developers to deploy near-real-time analytical intelligence closer to IoT devices so that they can unlock the full value of device-generated data. Azure Stream Analytics is designed for low latency, resiliency, efficient use of bandwidth, and compliance. Enterprises can deploy control logic close to the industrial operations and complement Big Data analytics done in the cloud.
 
-Azure Stream Analytics on IoT Edge runs within the [Azure IoT Edge](https://azure.microsoft.com/campaigns/iot-edge/) framework. Once the job is created in ASA, you can deploy and manage it using IoT Hub.
+Azure Stream Analytics on IoT Edge runs within the [Azure IoT Edge](https://azure.microsoft.com/campaigns/iot-edge/) framework. Once the job is created in Stream Analytics, you can deploy and manage it using IoT Hub.
 
-## Scenarios
-![High-level diagram of IoT Edge](media/stream-analytics-edge/ASAedge-highlevel-diagram.png)
+## Common scenarios
 
-* **Low-latency command and control**: For example, manufacturing safety systems must respond to operational data with ultra-low latency. With ASA on IoT Edge, you can analyze sensor data in near real-time, and issue commands when you detect anomalies to stop a machine or trigger alerts.
-* **Limited connectivity to the cloud**: Mission critical systems, such as remote mining equipment, connected vessels, or offshore drilling, need to analyze and react to data even when cloud connectivity is intermittent. With ASA, your streaming logic runs independently of the network connectivity and you can choose what you send to the cloud for further processing or storage.
-* **Limited bandwidth**: The volume of data produced by jet engines or connected cars can be so large that data must be filtered or pre-processed before sending it to the cloud. Using ASA, you can filter or aggregate the data that needs to be sent to the cloud.
-* **Compliance**: Regulatory compliance may require some data to be locally anonymized or aggregated before being sent to the cloud.
-
-## Edge jobs in Azure Stream Analytics
-### What is an "edge" job?
-
-ASA Edge jobs run in containers deployed to [Azure IoT Edge devices](../iot-edge/about-iot-edge.md). They are composed of two parts:
-1. A cloud part that is responsible for job definition: users define inputs, output, query, and other settings (out of order events, etc.) in the cloud.
-2. A module running on your IoT devices. It contains the ASA engine and receives the job definition from the cloud.
-
-ASA uses IoT Hub to deploy edge jobs to device(s). More information about [IoT Edge deployment can be seen here](../iot-edge/module-deployment-monitoring.md).
-
-![Azure Stream Analytics Edge job](media/stream-analytics-edge/stream-analytics-edge-job.png)
-
-
-### Installation instructions
-The high-level steps are described in the following table. More details are given in the following sections.
-
-| Step | Notes |
-| --- | --- |
-| **Create a storage container** | Storage containers are used to save your job definition where they can be accessed by your IoT devices. <br> You can reuse any existing storage container. |
-| **Create an ASA edge job** | Create a new job, select **Edge** as **hosting environment**. <br> These jobs are created/managed from the cloud, and run on your own IoT Edge devices. |
-| **Setup your IoT Edge environment on your device(s)** | Instructions for [Windows](../iot-edge/quickstart.md) or [Linux](../iot-edge/quickstart-linux.md).|
-| **Deploy ASA on your IoT Edge device(s)** | ASA job definition is exported to the storage container created earlier. |
-
-You can follow [this step-by-step tutorial](../iot-edge/tutorial-deploy-stream-analytics.md) to deploy your first ASA job on IoT Edge. The following video should help you understand the process to run a Stream Analytics job on an IoT edge device:
-
-
-> [!VIDEO https://channel9.msdn.com/Events/Connect/2017/T157/player]
-
-#### Create a storage container
-A storage container is required in order to export the ASA compiled query and the job configuration. It is used to configure the ASA Docker image with your specific query.
-1. Follow [these instructions](../storage/common/storage-account-create.md) to create a storage account from the Azure portal. You can keep all default options to use this account with ASA.
-2. In the newly created storage account, create a blob storage container:
-   1. Click on **Blobs**, then **+ Container**.
-   2. Enter a name and keep the container as **Private**.
-
-#### Create an ASA Edge job
-> [!Note]
-> This tutorial focuses on ASA job creation using Azure portal. You can also [use Visual Studio plugin to create an ASA Edge job](./stream-analytics-tools-for-visual-studio-edge-jobs.md)
-
-1. From the Azure portal, create a new "Stream Analytics job". [Direct link to create a new ASA job here](https://ms.portal.azure.com/#create/Microsoft.StreamAnalyticsJob).
-
-2. In the creation screen, select **Edge** as **hosting environment** (see the following picture)
-
-   ![Create Stream Analytics job on Edge](media/stream-analytics-edge/create-asa-edge-job.png)
-3. Job Definition
-   1. **Define Input Stream(s)**. Define one or several input streams for your job.
-   2. Define Reference data (optional).
-   3. **Define Output Stream(s)**. Define one or several outputs streams for your job.
-   4. **Define query**. Define the ASA query in the cloud using the inline editor. The compiler automatically checks the syntax enabled for ASA edge. You can also test your query by uploading sample data.
-
-4. Set the storage container information in the **IoT Edge settings** menu.
-
-5. Set optional settings
-   1. **Event ordering**. You can configure out-of-order policy in the portal. Documentation is available [here](/stream-analytics-query/time-skew-policies-azure-stream-analytics).
-   2. **Locale**. Set the internalization format.
+This section describes the common scenarios for Stream Analytics on IoT Edge. The following diagram shows the flow of data between IoT devices and the Azure cloud.
 
+:::image type="content" source="media/stream-analytics-edge/edge-high-level-diagram.png" alt-text="High level diagram of IoT Edge":::
 
+### Low-latency command and control
 
-> [!Note]
-> When a deployment is created, ASA exports the job definition to a storage container. This job definition remain the same during the duration of a deployment.
-> As a consequence, if you want to update a job running on the edge, you need to edit the job in ASA, and then create a new deployment in IoT Hub.
+Manufacturing safety systems must respond to operational data with ultra-low latency. With Stream Analytics on IoT Edge, you can analyze sensor data in near real-time, and issue commands when you detect anomalies to stop a machine or trigger alerts.
 
+### Limited connectivity to the cloud
 
-#### Set up your IoT Edge environment on your device(s)
-Edge jobs can be deployed on devices running Azure IoT Edge.
-For this, you need to follow these steps:
-- Create an Iot Hub.
-- Install Docker and IoT Edge runtime on your edge devices.
-- Set your devices as **IoT Edge devices** in IoT Hub.
+Mission critical systems, such as remote mining equipment, connected vessels, or offshore drilling, need to analyze and react to data even when cloud connectivity is intermittent. With Stream Analytics, your streaming logic runs independently of the network connectivity and you can choose what you send to the cloud for further processing or storage.
 
-These steps are described in the IoT Edge documentation for [Windows](../iot-edge/quickstart.md) or [Linux](../iot-edge/quickstart-linux.md).
+### Limited bandwidth
 
+The volume of data produced by jet engines or connected cars can be so large that data must be filtered or pre-processed before sending it to the cloud. Using Stream Analytics, you can filter or aggregate the data that needs to be sent to the cloud.
 
-#### Deployment ASA on your IoT Edge device(s)
-##### Add ASA to your deployment
-- In the Azure portal, open IoT Hub, navigate to **IoT Edge** and click on the device you want to target for this deployment.
-- Select **Set modules**, then select **+ Add** and choose **Azure Stream Analytics Module**.
-- Select the subscription and the ASA Edge job that you created. Click Save.
-![Add ASA module in your deployment](media/stream-analytics-edge/add-stream-analytics-module.png)
+### Compliance
 
+Regulatory compliance may require some data to be locally anonymized or aggregated before being sent to the cloud.
 
-> [!Note]
-> During this step, ASA creates a folder named "EdgeJobs" in the storage container (if it does not exist already). For each deployment, a new subfolder is created in the "EdgeJobs" folder.
-> When you deploy your job to IoT Edge devices, ASA creates a shared access signature (SAS) for the job definition file. The SAS key is securely transmitted to the IoT Edge devices using device twin. The expiration of this key is three years from the day of its creation.
-> When you update an IoT Edge job, the SAS will change, but the image version will not change. Once you **Update**, follow the deployment workflow, and an update notification is logged on the device.
-
-
-For more information about IoT Edge deployments, see to [this page](../iot-edge/module-deployment-monitoring.md).
-
+## Edge jobs in Azure Stream Analytics
 
-##### Configure routes
-IoT Edge provides a way to declaratively route messages between modules, and between modules and IoT Hub. The full syntax is described [here](../iot-edge/module-composition.md).
-Names of the inputs and outputs created in the ASA job can be used as endpoints for routing.
+Stream Analytics Edge jobs run in containers deployed to [Azure IoT Edge devices](../iot-edge/about-iot-edge.md). Edge jobs are composed of two parts:
 
-###### Example
+* A cloud part that is responsible for the job definition: users define inputs, output, query, and other settings, such as out of order events, in the cloud.
 
-```json
-{
-    "routes": {
-        "sensorToAsa": "FROM /messages/modules/tempSensor/* INTO BrokeredEndpoint(\"/modules/ASA/inputs/temperature\")",
-        "alertsToCloud": "FROM /messages/modules/ASA/* INTO $upstream",
-        "alertsToReset": "FROM /messages/modules/ASA/* INTO BrokeredEndpoint(\"/modules/tempSensor/inputs/control\")"
-    }
-}
+* A module running on your IoT devices. The module contains the Stream Analytics engine and receives the job definition from the cloud.
 
-```
-This example shows the routes for the scenario described in the following picture. It contains an edge job called "**ASA**", with an input named "**temperature**" and an output named "**alert**".
-![Diagram example of message routing](media/stream-analytics-edge/edge-message-routing-example.png)
+Stream Analytics uses IoT Hub to deploy edge jobs to device(s). For more information, see [IoT Edge deployment](../iot-edge/module-deployment-monitoring.md).
 
-This example defines the following routes:
-- Every message from the **tempSensor** is sent to the module named **ASA** to the input named **temperature**,
-- All outputs of **ASA** module are sent to the IoT Hub linked to this device ($upstream),
-- All outputs of **ASA** module are sent to the **control** endpoint of the **tempSensor**.
+:::image type="content" source="media/stream-analytics-edge/stream-analytics-edge-job.png" alt-text="Azure Stream Analytics Edge job":::
 
+## Edge job limitations
 
-## Technical information
-### Current limitations for IoT Edge jobs compared to cloud jobs
-The goal is to have parity between IoT Edge jobs and cloud jobs. Most SQL query language features are supported, enabling to run the same logic on both cloud and IoT Edge.
-However the following features are not yet supported for edge jobs:
+The goal is to have parity between IoT Edge jobs and cloud jobs. Most SQL query language features are supported for both edge and cloud. However, the following features are not supported for edge jobs:
 * User-defined functions (UDF) in JavaScript. UDF are available in [C# for IoT Edge jobs](./stream-analytics-edge-csharp-udf.md) (preview).
 * User-defined aggregates (UDA).
 * Azure ML functions.
@@ -153,47 +64,22 @@ However the following features are not yet supported for edge jobs:
 * Late arrival policy
 
 ### Runtime and hardware requirements
-To run ASA on IoT Edge, you need devices that can run [Azure IoT Edge](https://azure.microsoft.com/campaigns/iot-edge/).
-
-ASA and Azure IoT Edge use **Docker** containers to provide a portable solution that runs on multiple host operating systems (Windows, Linux).
+To run Stream Analytics on IoT Edge, you need devices that can run [Azure IoT Edge](https://azure.microsoft.com/campaigns/iot-edge/).
 
-ASA on IoT Edge is made available as Windows and Linux images, running on both x86-64 or ARM (Advanced RISC Machines) architectures.
-
-
-### Input and output
-#### Input and Output Streams
-ASA Edge jobs can get inputs and outputs from other modules running on IoT Edge devices. To connect from and to specific modules, you can set the routing configuration at deployment time. More information is described on [the IoT Edge module composition documentation](../iot-edge/module-composition.md).
-
-For both inputs and outputs, CSV and JSON formats are supported.
+Stream Analytics and Azure IoT Edge use **Docker** containers to provide a portable solution that runs on multiple host operating systems (Windows, Linux).
 
-For each input and output stream you create in your ASA job, a corresponding endpoint is created on your deployed module. These endpoints can be used in the routes of your deployment.
+Stream Analytics on IoT Edge is made available as Windows and Linux images, running on both x86-64 or ARM (Advanced RISC Machines) architectures.
 
-At present, the only supported stream input and stream output types are Edge Hub. Reference input supports reference file type. Other outputs can be reached using a cloud job downstream. For example, a Stream Analytics job hosted in Edge sends output to Edge Hub, which can then send output to IoT Hub. You can use a second cloud hosted Azure Stream Analytics job with input from IoT Hub and output to Power BI or another output type.
 
+## Input and output
 
+Stream Analytics Edge jobs can get inputs and outputs from other modules running on IoT Edge devices. To connect from and to specific modules, you can set the routing configuration at deployment time. More information is described on [the IoT Edge module composition documentation](../iot-edge/module-composition.md).
 
-##### Reference data
-Reference data (also known as a lookup table) is a finite data set that is static or slow changing in nature. It is used to perform a lookup or to correlate with your data stream. To make use of reference data in your Azure Stream Analytics job, you will generally use a [Reference Data JOIN](/stream-analytics-query/reference-data-join-azure-stream-analytics) in your query. For more information, see the [Using reference data for lookups in Stream Analytics](stream-analytics-use-reference-data.md).
-
-Only local reference data is supported. When a job is deployed to IoT Edge device, it loads reference data from the user defined file path.
-
-To create a job with reference data on Edge:
-
-1. Create a new input for your job.
-
-2. Choose **Reference data** as the **Source Type**.
-
-3. Have a reference data file ready on the device. For a Windows container, put the reference data file on the local drive and share the local drive with the Docker container. For a Linux container, create a Docker volume and populate the data file to the volume.
-
-4. Set the file path. For Windows Host OS and Windows container, use the absolute path: `E:\<PathToFile>\v1.csv`. For a Windows Host OS and Linux container or a Linux OS and Linux container, use the path in the volume: `<VolumeName>/file1.txt`.
-
-![New reference data input for Azure Stream Analytics job on IoT Edge](./media/stream-analytics-edge/Reference-Data-New-Input.png)
+For both inputs and outputs, CSV and JSON formats are supported.
 
-The reference data on IoT Edge update is triggered by a deployment. Once triggered, the ASA module picks the updated data without stopping the running job.
+For each input and output stream you create in your Stream Analytics job, a corresponding endpoint is created on your deployed module. These endpoints can be used in the routes of your deployment.
 
-There are two ways to update the reference data:
-* Update reference data path in your ASA job from Azure portal.
-* Update the IoT Edge deployment.
+The only supported stream input and stream output type is Edge Hub. Reference input supports reference file type. Other outputs can be reached using a cloud job downstream. For example, a Stream Analytics job hosted in Edge sends output to Edge Hub, which can then send output to IoT Hub. You can use a second cloud-hosted Azure Stream Analytics job with input from IoT Hub and output to Power BI or another output type.
 
 ## License and third-party notices
 * [Azure Stream Analytics on IoT Edge license](https://go.microsoft.com/fwlink/?linkid=862827).
@@ -228,7 +114,7 @@ For further assistance, try the [Microsoft Q&A question page for Azure Stream An
 ## Next steps
 
 * [More information on Azure Iot Edge](../iot-edge/about-iot-edge.md)
-* [ASA on IoT Edge tutorial](../iot-edge/tutorial-deploy-stream-analytics.md)
+* [Stream Analytics on IoT Edge tutorial](../iot-edge/tutorial-deploy-stream-analytics.md)
 * [Develop Stream Analytics Edge jobs using Visual Studio tools](./stream-analytics-tools-for-visual-studio-edge-jobs.md)
 * [Implement CI/CD for Stream Analytics using APIs](stream-analytics-cicd-api.md)
````
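The surviving "Input and output" text notes that each input and output stream defined in an edge job becomes an endpoint on the deployed module, addressable in IoT Edge deployment routes. A minimal sketch of such a `routes` fragment, consistent with the example this commit removes from the article (the module name `ASA`, input `temperature`, and sensor module `tempSensor` are illustrative, not required names):

```json
{
  "routes": {
    "sensorToAsa": "FROM /messages/modules/tempSensor/* INTO BrokeredEndpoint(\"/modules/ASA/inputs/temperature\")",
    "alertsToCloud": "FROM /messages/modules/ASA/* INTO $upstream"
  }
}
```

The endpoint segment after `/inputs/` must match an input name defined in the Stream Analytics job; `$upstream` forwards messages to the IoT Hub linked to the device.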

‎articles/stream-analytics/stream-analytics-use-reference-data.md

Lines changed: 13 additions & 1 deletion
````diff
@@ -6,7 +6,7 @@ ms.author: jeanb
 ms.reviewer: mamccrea
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 12/2/2020
+ms.date: 12/18/2020
 ---
 # Using reference data for lookups in Stream Analytics
 
@@ -132,6 +132,18 @@ FROM Step1
 JOIN refData2 ON refData2.Desc = Step1.Desc
 ```
 
+## IoT Edge jobs
+
+Only local reference data is supported for Stream Analytics edge jobs. When a job is deployed to IoT Edge device, it loads reference data from the user defined file path. Have a reference data file ready on the device. For a Windows container, put the reference data file on the local drive and share the local drive with the Docker container. For a Linux container, create a Docker volume and populate the data file to the volume.
+
+Reference data on IoT Edge update is triggered by a deployment. Once triggered, the Stream Analytics module picks the updated data without stopping the running job.
+
+There are two ways to update the reference data:
+
+* Update reference data path in your Stream Analytics job from Azure portal.
+
+* Update the IoT Edge deployment.
+
 ## Next steps
 > [!div class="nextstepaction"]
 > [Quickstart: Create a Stream Analytics job by using the Azure portal](stream-analytics-quick-create-portal.md)
````
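The section added above says an edge job loads reference data from a user-defined local file. As a sketch, a small reference file in JSON (one of the two supported formats alongside CSV) might look like this — the device IDs and site names are made up for illustration:

```json
[
  { "deviceId": "dev-001", "site": "Building 40" },
  { "deviceId": "dev-002", "site": "Building 41" }
]
```

In the job query, such a file would typically be correlated with the stream via a reference-data join, for example `JOIN refData ON refData.deviceId = stream.deviceId`.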
