**articles/stream-analytics/stream-analytics-power-bi-dashboard.md** (+19 −16 lines)
@@ -4,11 +4,11 @@ description: This article describes how to use a real-time Power BI dashboard to
services: stream-analytics
author: jseb225
ms.author: jeanb
- manager: kfile
- ms.reviewer: jasonh
+ ms.reviewer: mamccrea
ms.service: stream-analytics
ms.topic: conceptual
- ms.date: 06/27/2017
+ ms.date: 12/07/2018
+ ms.custom: seodec18
---

# Tutorial: Stream Analytics and Power BI: A real-time analytics dashboard for streaming data
Azure Stream Analytics enables you to take advantage of one of the leading business intelligence tools, [Microsoft Power BI](https://powerbi.com/). In this article, you learn how to create business intelligence tools by using Power BI as an output for your Azure Stream Analytics jobs. You also learn how to create and use a real-time dashboard.
@@ -38,13 +38,13 @@ In the real-time fraud detection tutorial, the output is sent to Azure Blob stor
4. Under **Sink**, select **Power BI**.

5. Click **Authorize**.

   A window opens where you can provide your Azure credentials for a work or school account.

6. Enter your credentials. Be aware that when you enter your credentials, you're also giving the Streaming Analytics job permission to access your Power BI area.

@@ -54,7 +54,7 @@ In the real-time fraud detection tutorial, the output is sent to Azure Blob stor
* **Dataset Name**: Enter `sa-dataset`. You can use a different name. If you do, make a note of it for later.
* **Table Name**: Enter `fraudulent-calls`. Currently, Power BI output from Stream Analytics jobs can have only one table in a dataset.

> [!WARNING]
> If Power BI has a dataset and table that have the same names as the ones that you specify in the Stream Analytics job, the existing ones are overwritten.
@@ -84,6 +84,7 @@ For more information about Power BI datasets, see the [Power BI REST API](https:
>[!NOTE]
>If you did not name the input `CallStream` in the fraud-detection tutorial, substitute your name for `CallStream` in the **FROM** and **JOIN** clauses in the query.

```SQL
/* Our criteria for fraud:
Calls made from the same caller to two phone switches in different locations (for example, Australia and Europe) within five seconds */
@@ -101,6 +102,7 @@ For more information about Power BI datasets, see the [Power BI REST API](https:
/* Where the switch location is different */
WHERE CS1.SwitchNum != CS2.SwitchNum
GROUP BY TumblingWindow(Duration(second, 1))
```

4. Click **Save**.
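
   In a Stream Analytics query, the `INTO` clause is what routes a statement's results to a named output such as the Power BI output configured above. As a reminder of the shape only (a hedged sketch, not the tutorial's exact text — the output alias here is hypothetical):

```SQL
SELECT
    System.Timestamp AS WindowEnd,
    COUNT(*) AS FraudulentCalls
INTO
    [fraud-power-bi-output]   -- hypothetical output alias; use the alias you configured
FROM
    CallStream TIMESTAMP BY CallRecTime
GROUP BY
    TumblingWindow(Duration(second, 1))
```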
@@ -114,7 +116,7 @@ This section is optional, but recommended.
* Go to the folder where the telcodatagen.exe and modified telcodatagen.exe.config files are.
* Run the following command:

   `telcodatagen.exe 1000 .2 2`

2. In the **Query** blade, click the dots next to the `CallStream` input and then select **Sample data from input**.
@@ -140,7 +142,7 @@ Your Streaming Analytics job starts looking for fraudulent calls in the incoming
1. Go to [Powerbi.com](https://powerbi.com) and sign in with your work or school account. If the Stream Analytics job query outputs results, you see that your dataset is already created:

2. In your workspace, click **+ Create**.
@@ -152,15 +154,15 @@ Your Streaming Analytics job starts looking for fraudulent calls in the incoming
4. At the top of the window, click **Add tile**, select **CUSTOM STREAMING DATA**, and then click **Next**.

Given this configuration, you can change the original query to the following:

```SQL
SELECT
    MAX(hmdt) AS hmdt,
    MAX(temp) AS temp,
@@ -229,7 +232,7 @@ Given this configuration, you can change the original query to the following:
GROUP BY
    TUMBLINGWINDOW(ss,4),
    dspl
```

### Renew authorization
If the password has changed since your job was created or last authenticated, you need to reauthenticate your Power BI account. If Azure Multi-Factor Authentication is configured on your Azure Active Directory (Azure AD) tenant, you also need to renew Power BI authorization every two weeks. If you don't renew, you could see symptoms such as a lack of job output or an `Authenticate user error` in the operation logs.
**articles/stream-analytics/stream-analytics-previews.md** (+1 −5 lines)
@@ -7,7 +7,7 @@ ms.author: mamccrea
ms.reviewer: jasonh
ms.service: stream-analytics
ms.topic: conceptual
- ms.date: 10/05/2018
+ ms.date: 12/07/2018
---

# Azure Stream Analytics preview features
@@ -18,10 +18,6 @@ This article summarizes all the features currently in preview for Azure Stream A
The following features are in public preview. You can take advantage of these features today, but don't use them in your production environment.

- ### Azure Stream Analytics on IoT Edge
-
- Azure Stream Analytics on IoT Edge allows developers to deploy near-real-time analytics on IoT Edge devices. For more information, see the [Azure Stream Analytics on IoT Edge](stream-analytics-edge.md) documentation.
-

### Integration with Azure Machine Learning
You can scale Stream Analytics jobs with Machine Learning (ML) functions. To learn more about how you can use ML functions in your Stream Analytics job, visit [Scale your Stream Analytics job with Azure Machine Learning functions](stream-analytics-scale-with-machine-learning-functions.md). Check out a real-world scenario with [Performing sentiment analysis by using Azure Stream Analytics and Azure Machine Learning](stream-analytics-machine-learning-integration-tutorial.md).
**articles/stream-analytics/stream-analytics-real-time-fraud-detection.md** (+35 −25 lines)
@@ -4,11 +4,11 @@ description: Learn how to create a real-time fraud detection solution with Strea
services: stream-analytics
author: mamccrea
ms.author: mamccrea
- manager: kfile
ms.reviewer: jasonh
ms.service: stream-analytics
ms.topic: conceptual
- ms.date: 03/28/2017
+ ms.date: 12/07/2018
+ ms.custom: seodec18
---
# Get started using Azure Stream Analytics: Real-time fraud detection
@@ -38,7 +38,7 @@ Before you start, make sure you have the following:
>[!NOTE]
>Windows might block the downloaded .zip file. If you can't unzip it, right-click the file and select **Properties**. If you see the "This file came from another computer and might be blocked to help protect this computer" message, select the **Unblock** option and then click **Apply**.

- If you want to examine the results of the Streaming Analytics job, you also need a tool for viewing the contents of an Azure Blob Storage container. If you use Visual Studio, you can use [Azure Tools for Visual Studio](https://docs.microsoft.com/azure/vs-azure-tools-storage-resources-server-explorer-browse-manage) or [Visual Studio Cloud Explorer](https://docs.microsoft.com/azure/vs-azure-tools-resources-managing-with-cloud-explorer). Alternatively, you can install standalone tools like [Azure Storage Explorer](http://storageexplorer.com/) or [Azure Explorer](http://www.cerebrata.com/products/azure-explorer/introduction).
+ If you want to examine the results of the Streaming Analytics job, you also need a tool for viewing the contents of an Azure Blob Storage container. If you use Visual Studio, you can use [Azure Tools for Visual Studio](https://docs.microsoft.com/azure/vs-azure-tools-storage-resources-server-explorer-browse-manage) or [Visual Studio Cloud Explorer](https://docs.microsoft.com/azure/vs-azure-tools-resources-managing-with-cloud-explorer). Alternatively, you can install standalone tools like [Azure Storage Explorer](https://storageexplorer.com/) or [Cerulean](https://www.cerebrata.com/products/cerulean/features/azure-storage).

## Create an Azure event hub to ingest events
@@ -56,7 +56,7 @@ In this procedure, you first create an event hub namespace, and then you add an
3. Select a subscription and create or choose a resource group, then click **Create**.

   <img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-create-eventhub-namespace-new-portal.png" alt="Create event hub namespace in Azure portal" width="300px"/>

4. When the namespace has finished deploying, find the event hub namespace in your list of Azure resources.
@@ -66,7 +66,7 @@ In this procedure, you first create an event hub namespace, and then you add an
6. Name the new event hub `asa-eh-frauddetection-demo`. You can use a different name. If you do, make a note of it, because you need the name later. You don't need to set any other options for the event hub right now.

7. Paste the connection string into a text editor. You need this connection string for the next section, after you make some small edits to it.
@@ -119,15 +119,17 @@ Before you start the TelcoGenerator app, you must configure it so that it will s
The `<appSettings>` section will look like the following example. (For clarity, the lines are wrapped and some characters have been removed from the authorization token.)

4. Save the file.

### Start the app

1. Open a command window and change to the folder where the TelcoGenerator app is unzipped.
2. Enter the following command:

```cmd
telcodatagen.exe 1000 0.2 2
```

The parameters are:
@@ -161,7 +163,7 @@ Now that you have a stream of call events, you can set up a Stream Analytics job
It's a good idea to place the job and the event hub in the same region for best performance and so that you don't pay to transfer data between regions.

   <img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-create-sa-input-new-portal.png" alt="Create Stream Analytics input in portal" width="300px"/>

4. Click **Create**.
@@ -213,7 +215,7 @@ The TelcoGenerator app is sending call records to the event hub, and your Stream
5. Set **Minutes** to 3 and then click **OK**.

   Azure samples 3 minutes' worth of data from the input stream and notifies you when the sample data is ready. (This takes a short while.)
@@ -226,11 +228,13 @@ As an alternative, you can get a .json file that has sample data in it [from Git
If you want to archive every event, you can use a pass-through query to read all the fields in the payload of the event.

1. In the query window, enter this query:

```SQL
SELECT
    *
FROM
    CallStream
```

>[!NOTE]
>As with SQL, keywords are not case-sensitive, and whitespace is not significant.
@@ -251,13 +255,15 @@ In many cases, your analysis doesn't need all the columns from the input stream.
1. Change the query in the code editor to the following:

### Count incoming calls by region: Tumbling window with aggregation
@@ -267,11 +273,13 @@ For this transformation, you want a sequence of temporal windows that don't over
1. Change the query in the code editor to the following:

```SQL
SELECT
    System.Timestamp as WindowEnd, SwitchNum, COUNT(*) as CallCount
FROM
    CallStream TIMESTAMP BY CallRecTime
GROUP BY TUMBLINGWINDOW(s, 5), SwitchNum
```

This query uses the `Timestamp By` keyword in the `FROM` clause to specify which timestamp field in the input stream to use to define the Tumbling window. In this case, the window divides the data into segments by the `CallRecTime` field in each record. (If no field is specified, the windowing operation uses the time that each event arrives at the event hub. See "Arrival Time Vs Application Time" in the [Stream Analytics Query Language Reference](https://msdn.microsoft.com/library/azure/dn834998.aspx).)
@@ -281,7 +289,7 @@ For this transformation, you want a sequence of temporal windows that don't over
2. Click **Test** again. In the results, notice that the timestamps under **WindowEnd** are in 5-second increments.

### Detect SIM fraud using a self-join
@@ -293,6 +301,7 @@ When you use a join with streaming data, the join must provide some limits on ho
1. Change the query in the code editor to the following:

```SQL
SELECT System.Timestamp as Time,
    CS1.CallingIMSI,
    CS1.CallingNum as CallingNum1,
@@ -304,6 +313,7 @@ When you use a join with streaming data, the join must provide some limits on ho
    ON CS1.CallingIMSI = CS2.CallingIMSI
    AND DATEDIFF(ss, CS1, CS2) BETWEEN 1 AND 5
WHERE CS1.SwitchNum != CS2.SwitchNum
```

This query is like any SQL join except for the `DATEDIFF` function in the join. This version of `DATEDIFF` is specific to Streaming Analytics, and it must appear in the `ON...BETWEEN` clause. The parameters are a time unit (seconds in this example) and the aliases of the two sources for the join. This is different from the standard SQL `DATEDIFF` function.
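
   For comparison, here is a small sketch (illustration only, not part of the tutorial text) that shows both forms side by side: the join form takes the two input aliases and must live in the `ON ... BETWEEN` clause, while the scalar form, like standard SQL, takes two datetime expressions:

```SQL
SELECT
    System.Timestamp AS Time,
    CS1.CallingIMSI,
    DATEDIFF(ss, CS1.CallRecTime, CS2.CallRecTime) AS SecondsApart  -- scalar form, usable in SELECT or WHERE
FROM CallStream CS1 TIMESTAMP BY CallRecTime
JOIN CallStream CS2 TIMESTAMP BY CallRecTime
    ON CS1.CallingIMSI = CS2.CallingIMSI
    AND DATEDIFF(ss, CS1, CS2) BETWEEN 1 AND 5                      -- join form, required for streaming joins
WHERE CS1.SwitchNum != CS2.SwitchNum
```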
@@ -315,7 +325,7 @@ When you use a join with streaming data, the join must provide some limits on ho
3. Click **Save** to save the self-join query as part of the Streaming Analytics job. (It doesn't save the sample data.)

   <img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-query-editor-save-button-new-portal.png" alt="Save Stream Analytics query in portal" width="300px"/>

## Create an output sink to store transformed data
@@ -329,7 +339,7 @@ If you have an existing blob storage account, you can use that. For this tutoria
1. From the upper left-hand corner of the Azure portal, select **Create a resource** > **Storage** > **Storage account**. Fill out the Storage account job page with **Name** set to "asaehstorage", **Location** set to "East US", **Resource group** set to "asa-eh-ns-rg" (host the storage account in the same resource group as the Streaming job for increased performance). The remaining settings can be left to their default values.

2. In the Azure portal, return to the Streaming Analytics job pane. (If you closed the pane, search for `asa_frauddetection_job_demo` in the **All resources** pane.)
@@ -344,7 +354,7 @@ If you have an existing blob storage account, you can use that. For this tutoria
   |Storage account | asaehstorage | Enter the name of the storage account you created. |
   |Container | asa-fraudulentcalls-demo | Choose Create new and enter a container name. |

   <img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-create-output-blob-storage-new-console.png" alt="Create blob output for Stream Analytics job" width="300px"/>

5. Click **Save**.
@@ -365,7 +375,7 @@ The job is now configured. You've specified an input (the event hub), a transfor
You now have a complete Streaming Analytics job. The job is examining a stream of phone call metadata, looking for fraudulent phone calls in real time, and writing information about those fraudulent calls to storage.

- To complete this tutorial, you might want to look at the data being captured by the Streaming Analytics job. The data is being written to Azure Blob Storage in chunks (files). You can use any tool that reads Azure Blob Storage. As noted in the Prerequisites section, you can use Azure extensions in Visual Studio, or you can use a tool like [Azure Storage Explorer](http://storageexplorer.com/) or [Azure Explorer](http://www.cerebrata.com/products/azure-explorer/introduction).
+ To complete this tutorial, you might want to look at the data being captured by the Streaming Analytics job. The data is being written to Azure Blob Storage in chunks (files). You can use any tool that reads Azure Blob Storage. As noted in the Prerequisites section, you can use Azure extensions in Visual Studio, or you can use a tool like [Azure Storage Explorer](https://storageexplorer.com/) or [Cerulean](https://www.cerebrata.com/products/cerulean/features/azure-storage).

When you examine the contents of a file in blob storage, you see something like the following:
**articles/stream-analytics/stream-analytics-scale-jobs.md** (+10 −9 lines)
@@ -43,15 +43,16 @@ If your query is not embarrassingly parallel, you can follow the following steps
Query:

```SQL
WITH Step1 AS (
    SELECT COUNT(*) AS Count, TollBoothId, PartitionId
    FROM Input1 Partition By PartitionId
    GROUP BY TumblingWindow(minute, 3), TollBoothId, PartitionId
)
SELECT SUM(Count) AS Count, TollBoothId
FROM Step1
GROUP BY TumblingWindow(minute, 3), TollBoothId
```

In the query above, you are counting cars per toll booth per partition, and then adding the count from all partitions together.
Once partitioned, allocate up to 6 SU for each partition of the step; 6 SU is the maximum for a single partition, so each partition can be placed on its own processing node.
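
To make the arithmetic concrete (an illustrative assumption, not a figure from the article): if `Input1` has 6 partitions, the partitioned `Step1` can be allocated up to 6 × 6 = 36 SU, and the final non-partitioned aggregation step can use at most another 6 SU, so the job as a whole can scale up to 42 SU.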
**articles/stream-analytics/stream-analytics-scale-with-machine-learning-functions.md** (+2 −2 lines)
@@ -4,7 +4,6 @@ description: This article describes how to scale Stream Analytics jobs that use
services: stream-analytics
author: jseb225
ms.author: jeanb
- manager: kfile
ms.reviewer: jasonh
ms.service: stream-analytics
ms.topic: conceptual
@@ -40,14 +39,15 @@ The following example includes a Stream Analytics job with the sentiment analysi
The query is a simple fully partitioned query followed by the **sentiment** function, as shown following:

```SQL
WITH subquery AS (
    SELECT text, sentiment(text) as result from input
)

Select text, result.[Score]
Into output
From subquery
```

Consider the following scenario: with a throughput of 10,000 tweets per second, a Stream Analytics job must be created to perform sentiment analysis of the tweets (events). Using 1 SU, can this Stream Analytics job handle the traffic? Using the default batch size of 1000, the job should be able to keep up with the input. Further, the added Machine Learning function should generate no more than a second of latency, which is the general default latency of the sentiment analysis Machine Learning web service (with a default batch size of 1000). The Stream Analytics job's **overall** or end-to-end latency would typically be a few seconds. Take a more detailed look into this Stream Analytics job, *especially* the Machine Learning function calls. With a batch size of 1000, a throughput of 10,000 events takes about 10 requests to the web service. Even with 1 SU, there are enough concurrent connections to accommodate this input traffic.
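
To spell that arithmetic out (an illustrative restatement, not new guidance): 10,000 events per second ÷ a batch size of 1,000 ≈ 10 web-service requests per second, which fits comfortably within the concurrent connections available to a single SU.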
If the input event rate increases by 100x, then the Stream Analytics job needs to process 1,000,000 tweets per second. There are two options to accomplish the increased scale:
**articles/stream-analytics/stream-analytics-set-up-alerts.md** (+4 −4 lines)
@@ -4,11 +4,11 @@ description: This article describes how to use the Azure portal to set up monito
services: stream-analytics
author: jseb225
ms.author: jeanb
- manager: kfile
- ms.reviewer: jasonh
+ ms.reviewer: mamccrea
ms.service: stream-analytics
ms.topic: conceptual
- ms.date: 06/26/2017
+ ms.date: 12/07/2018
+ ms.custom: seodec18
---
# Set up alerts for Azure Stream Analytics jobs
You can set up an alert to trigger when a metric reaches a condition that you specify. For example, you might set up an alert for a condition like the following:
@@ -24,7 +24,7 @@ Rules can be set up on metrics through the portal, or can be configured [program
3. In the **Metric** blade, click the **Add alert** command.
**articles/stream-analytics/stream-analytics-stream-analytics-query-patterns.md** (+32 −13 lines)
@@ -3,7 +3,6 @@ title: Common query patterns in Azure Stream Analytics
description: This article describes a number of common query patterns and designs that are useful in Azure Stream Analytics jobs.
services: stream-analytics
author: jseb225
- manager: kfile
ms.author: jeanb
ms.reviewer: jasonh
ms.service: stream-analytics
@@ -43,6 +42,7 @@ For example, the car weight is coming on the input stream as strings and needs t
**Solution**:

```SQL
SELECT
    Make,
    SUM(CAST(Weight AS BIGINT)) AS Weight
@@ -51,6 +51,7 @@ For example, the car weight is coming on the input stream as strings and needs t
GROUP BY
    Make,
    TumblingWindow(second, 10)
```

**Explanation**:
Use a **CAST** statement in the **Weight** field to specify its data type. See the list of supported data types in [Data types (Azure Stream Analytics)](https://msdn.microsoft.com/library/azure/dn835065.aspx).
@@ -76,12 +77,14 @@ For example, check that the result returns license plates that start with A and
**Solution**:

```SQL
SELECT
    *
FROM
    Input TIMESTAMP BY Time
WHERE
    LicensePlate LIKE 'A%9'
```

**Explanation**:
Use the **LIKE** statement to check the **LicensePlate** field value. It should start with an A, then have any string of zero or more characters, and then end with a 9.
@@ -107,6 +110,7 @@ For example, provide a string description for how many cars of the same make pas
**Solution**:

```SQL
SELECT
    CASE
        WHEN COUNT(*) = 1 THEN CONCAT('1 ', Make)
@@ -118,6 +122,7 @@ For example, provide a string description for how many cars of the same make pas
GROUP BY
    Make,
    TumblingWindow(second, 10)
```

**Explanation**:
The **CASE** expression compares an expression to a set of simple expressions to determine the result. In this example, vehicle makes with a count of 1 returned a different string description than vehicle makes with a count other than 1.
@@ -154,6 +159,7 @@ For example, analyze data for a threshold-based alert and archive all events to
**Solution**:

```SQL
SELECT
    *
INTO
@@ -174,6 +180,7 @@ For example, analyze data for a threshold-based alert and archive all events to
    TumblingWindow(second, 10)
HAVING
    [Count] >= 3
```

**Explanation**:
The **INTO** clause tells Stream Analytics which of the outputs to write the data to from this statement.
@@ -183,6 +190,7 @@ The second query does some simple aggregation and filtering, and it sends the re
Note that you can also reuse the results of the common table expressions (CTEs) (such as **WITH** statements) in multiple output statements. This option has the added benefit of opening fewer readers to the input source.

For example:

```SQL
WITH AllRedCars AS (
    SELECT
        *
@@ -193,6 +201,7 @@ For example:
)
SELECT * INTO HondaOutput FROM AllRedCars WHERE Make = 'Honda'
SELECT * INTO ToyotaOutput FROM AllRedCars WHERE Make = 'Toyota'
```

## Query example: Count unique values
**Description**: Count the number of unique field values that appear in the stream within a time window.
@@ -217,14 +226,14 @@ For example, how many unique makes of cars passed through the toll booth in a 2-
**Solution:**

```SQL
SELECT
    COUNT(DISTINCT Make) AS CountMake,
    System.TIMESTAMP AS TIME
FROM Input TIMESTAMP BY TIME
GROUP BY
    TumblingWindow(second, 2)
```

**Explanation:**
@@ -249,13 +258,15 @@ For example, is the previous car on the toll road the same make as the current c
**Solution**:

```SQL
SELECT
    Make,
    Time
FROM
    Input TIMESTAMP BY Time
WHERE
    LAG(Make, 1) OVER (LIMIT DURATION(minute, 1)) <> Make
```

**Explanation**:
Use **LAG** to peek into the input stream one event back and get the **Make** value. Then compare it to the **Make** value on the current event and output the event if they are different.
@@ -284,6 +295,7 @@ Use **LAG** to peek into the input stream one event back and get the **Make** va
**Solution**:

```SQL
SELECT
    LicensePlate,
    Make,
@@ -292,6 +304,7 @@ Use **LAG** to peek into the input stream one event back and get the **Make** va
    Input TIMESTAMP BY Time
WHERE
    IsFirst(minute, 10) = 1
```

Now let’s change the problem and find the first car of a particular make in every 10-minute interval.
@@ -305,6 +318,7 @@ Now let’s change the problem and find the first car of a particular make in ev
**Solution**:

```SQL
SELECT
    LicensePlate,
    Make,
@@ -313,6 +327,7 @@ Now let’s change the problem and find the first car of a particular make in ev
    Input TIMESTAMP BY Time
WHERE
    IsFirst(minute, 10) OVER (PARTITION BY Make) = 1
```

## Query example: Find the last event in a window
**Description**: Find the last car in every 10-minute interval.
@@ -338,6 +353,7 @@ Now let’s change the problem and find the first car of a particular make in ev
**Solution**:

```SQL
WITH LastInWindow AS
(
    SELECT
@@ -356,6 +372,7 @@ Now let’s change the problem and find the first car of a particular make in ev
    INNER JOIN LastInWindow
    ON DATEDIFF(minute, Input, LastInWindow) BETWEEN 0 AND 10
    AND Input.Time = LastInWindow.LastEventTime
```

**Explanation**:
There are two steps in the query. The first one finds the latest time stamp in 10-minute windows. The second step joins the results of the first query with the original stream to find the events that match the last time stamps in each window.
@@ -381,6 +398,7 @@ For example, have 2 consecutive cars from the same make entered the toll road wi
**Solution**:

```SQL
SELECT
    Make,
    Time,
@@ -391,6 +409,7 @@ For example, have 2 consecutive cars from the same make entered the toll road wi
    Input TIMESTAMP BY Time
WHERE
    LAG(Make, 1) OVER (LIMIT DURATION(second, 90)) = Make
```

**Explanation**:
Use **LAG** to peek into the input stream one event back and get the **Make** value. Compare it to the **MAKE** value in the current event, and then output the event if they are the same. You can also use **LAG** to get data about the previous car.
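
For example, a quick sketch (illustration only, not from the article) that also returns details about the previous car by applying **LAG** to additional columns:

```SQL
SELECT
    Make,
    Time,
    LAG(LicensePlate, 1) OVER (LIMIT DURATION(second, 90)) AS PreviousLicensePlate,  -- previous car's plate
    LAG(Time, 1) OVER (LIMIT DURATION(second, 90)) AS PreviousTime                   -- previous car's timestamp
FROM
    Input TIMESTAMP BY Time
WHERE
    LAG(Make, 1) OVER (LIMIT DURATION(second, 90)) = Make
```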
@@ -413,13 +432,13 @@ Use **LAG** to peek into the input stream one event back and get the **Make** va
**Solution**:

```SQL
SELECT
    [user], feature, DATEDIFF(second, LAST(Time) OVER (PARTITION BY [user], feature LIMIT DURATION(hour, 1) WHEN Event = 'start'), Time) as duration
FROM input TIMESTAMP BY Time
WHERE
    Event = 'end'
```

**Explanation**:
Use the **LAST** function to retrieve the last **TIME** value when the event type was **Start**. The **LAST** function uses **PARTITION BY [user]** to indicate that the result is computed per unique user. The query has a 1-hour maximum threshold for the time difference between **Start** and **Stop** events, but is configurable as needed (**LIMIT DURATION(hour, 1)**).
@@ -449,7 +468,7 @@ For example, suppose that a bug resulted in all cars having an incorrect weight
**Solution**:

```SQL
WITH SelectPreviousEvent AS
(
    SELECT
@@ -466,7 +485,7 @@ For example, suppose that a bug resulted in all cars having an incorrect weight
WHERE
    [weight] < 20000
    AND previousWeight > 20000
```

**Explanation**:
Use **LAG** to view the input stream for 24 hours and look for instances where **StartFault** and **StopFault** are spanned by the weight < 20000.
@@ -503,13 +522,14 @@ For example, generate an event every 5 seconds that reports the most recently se
**Solution**:

```SQL
SELECT
    System.Timestamp AS windowEnd,
    TopOne() OVER (ORDER BY t DESC) AS lastEvent
FROM
    input TIMESTAMP BY t
GROUP BY HOPPINGWINDOW(second, 300, 5)
```

**Explanation**:
This query generates events every 5 seconds and outputs the last event that was received previously. The [Hopping window](https://msdn.microsoft.com/library/dn835041.aspx "Hopping window--Azure Stream Analytics") duration determines how far back the query looks to find the latest event (300 seconds in this example).
@@ -550,7 +570,7 @@ For example, in an IoT scenario for home ovens, an alert must be generated when
**Solution**:

```SQL
WITH max_power_during_last_3_mins AS (
    SELECT
        System.TimeStamp AS windowTime,
@@ -584,7 +604,7 @@ WHERE
    t1.sensorName = 'temp'
    AND t1.value <= 40
    AND t2.maxPower > 10
```

**Explanation**:
The first query, `max_power_during_last_3_mins`, uses the [Sliding window](https://msdn.microsoft.com/azure/stream-analytics/reference/sliding-window-azure-stream-analytics) to find the max value of the power sensor for every device during the last 3 minutes.
@@ -619,15 +639,14 @@ And then, provided the conditions are met, an alert is generated for the device.
**Solution**:

```SQL
SELECT
    TollId,
    COUNT(*) AS Count
FROM input
    TIMESTAMP BY Time OVER TollId
GROUP BY TUMBLINGWINDOW(second, 5), TollId
```

**Explanation**:
The [TIMESTAMP BY OVER](https://msdn.microsoft.com/azure/stream-analytics/reference/timestamp-by-azure-stream-analytics#over-clause-interacts-with-event-ordering) clause looks at each device timeline separately using substreams. The output events for each TollID are generated as they are computed, meaning that the events are in order with respect to each TollID instead of being reordered as if all devices were on the same clock.
The *arm* command takes the job template and job template parameter files generated through build as input. Then it combines them into a job definition JSON file that can be used with the Stream Analytics PowerShell API.
```powershell
arm -JobTemplate <templateFilePath> -JobParameterFile <jobParameterFilePath> [-OutputFile <asaArmFilePath>]
```

Example:

```powershell
./tools/SA.exe arm -JobTemplate "ProjectA.JobTemplate.json" -JobParameterFile "ProjectA.JobTemplate.parameters.json" -OutputFile "JobDefinition.json"
```
**articles/stream-analytics/stream-analytics-tools-for-visual-studio-edge-jobs.md** (+12 −12 lines)
@@ -4,11 +4,11 @@ description: This article describes how to author, debug, and create your Stream
services: stream-analytics
author: su-jie
ms.author: sujie
- manager: kfile
ms.reviewer: jasonh
ms.service: stream-analytics
ms.topic: conceptual
- ms.date: 03/13/2018
+ ms.date: 12/07/2018
+ ms.custom: seodec18
---

# Develop Stream Analytics Edge jobs using Visual Studio tools
@@ -27,11 +27,11 @@ You need the following prerequisites to complete this tutorial:
From Visual Studio, select **File** > **New** > **Project**. Navigate to the **Templates** list on the left > expand **Azure Stream Analytics** > **Stream Analytics Edge** > **Azure Stream Analytics Edge Application**. Provide a Name, Location, and Solution name for your project and select **OK**.

After the project gets created, navigate to the **Solution Explorer** to view the folder hierarchy.

## Choose the correct subscription
@@ -45,7 +45,7 @@ After the project gets created, navigate to the **Solution Explorer** to view th
1. From the **Solution Explorer**, expand the **Inputs** node. You should see an input named **EdgeInput.json**. Double-click to view its settings.

2. Make sure that Source Type is set to **Data Stream** > Source is set to **Edge Hub** > Event Serialization Format set to **Json** > and Encoding is set to **UTF8**. Optionally, you can rename the **Input Alias**; let's leave it as is for this example. If you rename the input alias, use the name you specified when defining the query. Select **Save** to save the settings.
@@ -54,7 +54,7 @@ After the project gets created, navigate to the **Solution Explorer** to view th
1. From the **Solution Explorer**, expand the **Outputs** node. You should see an output named **EdgeOutput.json**. Double-click to view its settings.

2. Make sure that Sink is set to **Edge Hub** > Event Serialization Format set to **Json** > and Encoding is set to **UTF8** > and Format is set to **Array**. Optionally, you can rename the **Output Alias**; let's leave it as is for this example. If you rename the output alias, use the name you specified when defining the query. Select **Save** to save the settings.
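
   The job's query refers to these input and output aliases. As a point of reference, a minimal pass-through sketch (an assumption for illustration, using the default **EdgeInput** and **EdgeOutput** aliases above) looks like this:

```SQL
SELECT
    *
INTO EdgeOutput   -- default output alias from EdgeOutput.json
FROM EdgeInput    -- default input alias from EdgeInput.json
```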

5. The command prompt window shows the status of the job. When the job runs successfully, it creates a folder that looks like "2018-02-23-11-31-42" in your project folder path "Visual Studio 2015\Projects\MyASAEdgejob\MyASAEdgejob\ASALocalRun\2018-02-23-11-31-42". Navigate to the folder path to view the results in the local folder:
You can also sign in to the Azure portal and verify that the job is created.

## Submit the job to Azure

@@ -102,19 +102,19 @@ To test the query locally, you should upload the sample data. You can get sample
3. A pop-up window opens, where you can choose to update an existing Edge job or create a new one. When you update an existing job, it replaces all of the job configuration; in this scenario, you'll publish a new job. Select **Create a New Azure Stream Analytics Job** > enter a name for your job, something like **MyASAEdgeJob** > choose the required **Subscription**, **Resource Group**, and **Location** > Select **Submit**.

   Now that your Stream Analytics Edge job has been created, you can refer to the [Run jobs on IoT Edge tutorial](stream-analytics-edge.md) to learn how to deploy it to your devices.

## Manage the job

You can view the status of the job and the job diagram from the Server Explorer. From the **Server Explorer** > **Stream Analytics** > expand the subscription and the resource group where you deployed the Edge job > you can view the MyASAEdgejob with status **Created**. Expand your job node and double-click on it to open the job view.

The job view window provides you with operations such as refreshing the job, deleting the job, and opening the job from the Azure portal.