
Commit c751529

Committed Dec 7, 2018
seo fixes
1 parent 25af6cf commit c751529

22 files changed: +121 additions, -95 deletions

articles/stream-analytics/stream-analytics-power-bi-dashboard.md

Lines changed: 19 additions & 16 deletions
@@ -4,11 +4,11 @@ description: This article describes how to use a real-time Power BI dashboard to
 services: stream-analytics
 author: jseb225
 ms.author: jeanb
-manager: kfile
-ms.reviewer: jasonh
+ms.reviewer: mamccrea
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 06/27/2017
+ms.date: 12/07/2018
+ms.custom: seodec18
 ---
 # Tutorial: Stream Analytics and Power BI: A real-time analytics dashboard for streaming data
 Azure Stream Analytics enables you to take advantage of one of the leading business intelligence tools, [Microsoft Power BI](https://powerbi.com/). In this article, you learn how to create business intelligence tools by using Power BI as an output for your Azure Stream Analytics jobs. You also learn how to create and use a real-time dashboard.
@@ -38,13 +38,13 @@ In the real-time fraud detection tutorial, the output is sent to Azure Blob stor
 
 4. Under **Sink**, select **Power BI**.
 
-![Create an output for Power BI](./media/stream-analytics-power-bi-dashboard/create-pbi-ouptut.png)
+![Create an output for Power BI](./media/stream-analytics-power-bi-dashboard/create-power-bi-ouptut.png)
 
 5. Click **Authorize**.
 
 A window opens where you can provide your Azure credentials for a work or school account.
 
-![Enter credentials for access to Power BI](./media/stream-analytics-power-bi-dashboard/authorize-area.png)
+![Enter credentials for access to Power BI](./media/stream-analytics-power-bi-dashboard/power-bi-authorization-credentials.png)
 
 6. Enter your credentials. Be aware that when you enter your credentials, you're also giving permission to the Streaming Analytics job to access your Power BI area.
 
@@ -54,7 +54,7 @@ In the real-time fraud detection tutorial, the output is sent to Azure Blob stor
 * **Dataset Name**: Enter `sa-dataset`. You can use a different name. If you do, make a note of it for later.
 * **Table Name**: Enter `fraudulent-calls`. Currently, Power BI output from Stream Analytics jobs can have only one table in a dataset.
 
-![PBI workspace](./media/stream-analytics-power-bi-dashboard/create-pbi-ouptut-with-dataset-table.png)
+![Power BI workspace dataset and table](./media/stream-analytics-power-bi-dashboard/create-pbi-ouptut-with-dataset-table.png)
 
 > [!WARNING]
 > If Power BI has a dataset and table that have the same names as the ones that you specify in the Stream Analytics job, the existing ones are overwritten.
@@ -84,6 +84,7 @@ For more information about Power BI datasets, see the [Power BI REST API](https:
 >[!NOTE]
 >If you did not name the input `CallStream` in the fraud-detection tutorial, substitute your name for `CallStream` in the **FROM** and **JOIN** clauses in the query.
 
+```SQL
 /* Our criteria for fraud:
 Calls made from the same caller to two phone switches in different locations (for example, Australia and Europe) within five seconds */
 
@@ -101,6 +102,7 @@ For more information about Power BI datasets, see the [Power BI REST API](https:
 /* Where the switch location is different */
 WHERE CS1.SwitchNum != CS2.SwitchNum
 GROUP BY TumblingWindow(Duration(second, 1))
+```
 
 4. Click **Save**.
 
@@ -114,7 +116,7 @@ This section is optional, but recommended.
 * Go to the folder where the telcogenerator.exe and modified telcodatagen.exe.config files are.
 * Run the following command:
 
-telcodatagen.exe 1000 .2 2
+`telcodatagen.exe 1000 .2 2`
 
 2. In the **Query** blade, click the dots next to the `CallStream` input and then select **Sample data from input**.
 
@@ -140,7 +142,7 @@ Your Streaming Analytics job starts looking for fraudulent calls in the incoming
 
 1. Go to [Powerbi.com](https://powerbi.com) and sign in with your work or school account. If the Stream Analytics job query outputs results, you see that your dataset is already created:
 
-![Streaming dataset in Power BI](./media/stream-analytics-power-bi-dashboard/streaming-dataset.png)
+![Streaming dataset location in Power BI](./media/stream-analytics-power-bi-dashboard/stream-analytics-streaming-dataset.png)
 
 2. In your workspace, click **+ Create**.
 
@@ -152,15 +154,15 @@ Your Streaming Analytics job starts looking for fraudulent calls in the incoming
 
 4. At the top of the window, click **Add tile**, select **CUSTOM STREAMING DATA**, and then click **Next**.
 
-![Custom streaming dataset](./media/stream-analytics-power-bi-dashboard/custom-streaming-data.png)
+![Custom streaming dataset tile in Power BI](./media/stream-analytics-power-bi-dashboard/custom-streaming-data.png)
 
 5. Under **YOUR DATASETS**, select your dataset and then click **Next**.
 
-![Your streaming dataset](./media/stream-analytics-power-bi-dashboard/your-streaming-dataset.png)
+![Your streaming dataset in Power BI](./media/stream-analytics-power-bi-dashboard/your-streaming-dataset.png)
 
 6. Under **Visualization Type**, select **Card**, and then in the **Fields** list, select **fraudulentcalls**.
 
-![Visualization details for new tile](./media/stream-analytics-power-bi-dashboard/add-fraud.png)
+![Visualization details for new tile](./media/stream-analytics-power-bi-dashboard/add-fraudulent-calls-tile.png)
 
 7. Click **Next**.
 
@@ -172,7 +174,7 @@ Your Streaming Analytics job starts looking for fraudulent calls in the incoming
 
 Now you have a fraud counter!
 
-![Fraud counter](./media/stream-analytics-power-bi-dashboard/fraud-counter.png)
+![Fraud counter in Power BI dashboard](./media/stream-analytics-power-bi-dashboard/power-bi-fraud-counter-tile.png)
 
 8. Follow the steps again to add a tile (starting with step 4). This time, do the following:
 
@@ -181,7 +183,7 @@ Your Streaming Analytics job starts looking for fraudulent calls in the incoming
 * Add a value and select **fraudulentcalls**.
 * For **Time window to display**, select the last 10 minutes.
 
-![Create tile for line chart](./media/stream-analytics-power-bi-dashboard/pbi-create-tile-line-chart.png)
+![Create tile for line chart in Power BI](./media/stream-analytics-power-bi-dashboard/pbi-create-tile-line-chart.png)
 
 9. Click **Next**, add a title and subtitle, and click **Apply**.
 
@@ -204,7 +206,7 @@ Currently, Power BI can be called roughly once per second. Streaming visuals sup
 
 You can use the following equation to compute the value to give your window in seconds:
 
-![Equation1](./media/stream-analytics-power-bi-dashboard/equation1.png)
+![Equation to compute value to give window in seconds](./media/stream-analytics-power-bi-dashboard/compute-window-seconds-equation.png)
 
 For example:
 
@@ -214,10 +216,11 @@ For example:
 
 As a result, the equation becomes:
 
-![Equation2](./media/stream-analytics-power-bi-dashboard/equation2.png)
+![Equation based on example criteria](./media/stream-analytics-power-bi-dashboard/power-bi-example-equation.png)
 
 Given this configuration, you can change the original query to the following:
 
+```SQL
 SELECT
     MAX(hmdt) AS hmdt,
     MAX(temp) AS temp,
@@ -229,7 +232,7 @@ Given this configuration, you can change the original query to the following:
 GROUP BY
     TUMBLINGWINDOW(ss,4),
     dspl
-
+```
 
 ### Renew authorization
 If the password has changed since your job was created or last authenticated, you need to reauthenticate your Power BI account. If Azure Multi-Factor Authentication is configured on your Azure Active Directory (Azure AD) tenant, you also need to renew Power BI authorization every two weeks. If you don't renew, you could see symptoms such as a lack of job output or an `Authenticate user error` in the operation logs.

articles/stream-analytics/stream-analytics-previews.md

Lines changed: 1 addition & 5 deletions
@@ -7,7 +7,7 @@ ms.author: mamccrea
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 10/05/2018
+ms.date: 12/07/2018
 ---
 
 # Azure Stream Analytics preview features
@@ -18,10 +18,6 @@ This article summarizes all the features currently in preview for Azure Stream A
 
 The following features are in public preview. You can take advantage of these features today, but don't use them in your production environment.
 
-### Azure Stream Analytics on IoT Edge
-
-Azure Stream Analytics on IoT Edge allows developers to deploy near-real-time analytics on IoT Edge devices. For more information, see the [Azure Stream Analytics on IoT Edge](stream-analytics-edge.md) documentation.
-
 ### Integration with Azure Machine Learning
 
 You can scale Stream Analytics jobs with Machine Learning (ML) functions. To learn more about how you can use ML functions in your Stream Analytics job, visit [Scale your Stream Analytics job with Azure Machine Learning functions](stream-analytics-scale-with-machine-learning-functions.md). Check out a real-world scenario with [Performing sentiment analysis by using Azure Stream Analytics and Azure Machine Learning](stream-analytics-machine-learning-integration-tutorial.md).

articles/stream-analytics/stream-analytics-real-time-event-processing-reference-architecture.md

Lines changed: 0 additions & 1 deletion
@@ -4,7 +4,6 @@ description: This article describes the reference architecture to achieve real-t
 services: stream-analytics
 author: jseb225
 ms.author: jeanb
-manager: kfile
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual

articles/stream-analytics/stream-analytics-real-time-fraud-detection.md

Lines changed: 35 additions & 25 deletions
@@ -4,11 +4,11 @@ description: Learn how to create a real-time fraud detection solution with Strea
 services: stream-analytics
 author: mamccrea
 ms.author: mamccrea
-manager: kfile
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 03/28/2017
+ms.date: 12/07/2018
+ms.custom: seodec18
 ---
 # Get started using Azure Stream Analytics: Real-time fraud detection
 
@@ -38,7 +38,7 @@ Before you start, make sure you have the following:
 >[!NOTE]
 >Windows might block the downloaded .zip file. If you can't unzip it, right-click the file and select **Properties**. If you see the "This file came from another computer and might be blocked to help protect this computer" message, select the **Unblock** option and then click **Apply**.
 
-If you want to examine the results of the Streaming Analytics job, you also need a tool for viewing the contents of an Azure Blob Storage container. If you use Visual Studio, you can use [Azure Tools for Visual Studio](https://docs.microsoft.com/azure/vs-azure-tools-storage-resources-server-explorer-browse-manage) or [Visual Studio Cloud Explorer](https://docs.microsoft.com/azure/vs-azure-tools-resources-managing-with-cloud-explorer). Alternatively, you can install standalone tools like [Azure Storage Explorer](http://storageexplorer.com/) or [Azure Explorer](http://www.cerebrata.com/products/azure-explorer/introduction).
+If you want to examine the results of the Streaming Analytics job, you also need a tool for viewing the contents of an Azure Blob Storage container. If you use Visual Studio, you can use [Azure Tools for Visual Studio](https://docs.microsoft.com/azure/vs-azure-tools-storage-resources-server-explorer-browse-manage) or [Visual Studio Cloud Explorer](https://docs.microsoft.com/azure/vs-azure-tools-resources-managing-with-cloud-explorer). Alternatively, you can install standalone tools like [Azure Storage Explorer](https://storageexplorer.com/) or [Cerulean](https://www.cerebrata.com/products/cerulean/features/azure-storage).
 
 ## Create an Azure event hub to ingest events
 
@@ -56,7 +56,7 @@ In this procedure, you first create an event hub namespace, and then you add an
 
 3. Select a subscription and create or choose a resource group, then click **Create**.
 
-<img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-create-eventhub-namespace-new-portal.png" alt="drawing" width="300px"/>
+<img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-create-eventhub-namespace-new-portal.png" alt="Create event hub namespace in Azure portal" width="300px"/>
 
 4. When the namespace has finished deploying, find the event hub namespace in your list of Azure resources.
 
@@ -66,7 +66,7 @@ In this procedure, you first create an event hub namespace, and then you add an
 
 6. Name the new event hub `asa-eh-frauddetection-demo`. You can use a different name. If you do, make a note of it, because you need the name later. You don't need to set any other options for the event hub right now.
 
-<img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-create-eventhub-new-portal.png" alt="drawing" width="400px"/>
+<img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-create-eventhub-new-portal.png" alt="Name event hub in Azure portal" width="400px"/>
 
 
 7. Click **Create**.
@@ -84,15 +84,15 @@ Before a process can send data to an event hub, the event hub must have a policy
 
 3. Add a policy named `sa-policy-manage-demo` and for **Claim**, select **Manage**.
 
-<img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-create-shared-access-policy-manage-new-portal.png" alt="drawing" width="300px"/>
+<img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-create-shared-access-policy-manage-new-portal.png" alt="Create shared access policy for Stream Analytics" width="300px"/>
 
 4. Click **Create**.
 
 5. After the policy has been deployed, click it in the list of shared access policies.
 
 6. Find the box labeled **CONNECTION STRING-PRIMARY KEY** and click the copy button next to the connection string.
 
-<img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-shared-access-policy-copy-connection-string-new-portal.png" alt="drawing" width="300px"/>
+<img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-shared-access-policy-copy-connection-string-new-portal.png" alt="Stream Analytics shared access policy" width="300px"/>
 
 7. Paste the connection string into a text editor. You need this connection string for the next section, after you make some small edits to it.
 
@@ -119,15 +119,17 @@ Before you start the TelcoGenerator app, you must configure it so that it will s
 
 The `<appSettings>` section will look like the following example. (For clarity, the lines are wrapped and some characters have been removed from the authorization token.)
 
-![TelcoGenerator app configuration file showing the event hub name and connection string](./media/stream-analytics-real-time-fraud-detection/stream-analytics-telcogenerator-config-file-app-settings.png)
+![TelcoGenerator config file shows event hub name and connection string](./media/stream-analytics-real-time-fraud-detection/stream-analytics-telcogenerator-config-file-app-settings.png)
 
 4. Save the file.
 
 ### Start the app
 1. Open a command window and change to the folder where the TelcoGenerator app is unzipped.
 2. Enter the following command:
 
+```cmd
 telcodatagen.exe 1000 0.2 2
+```
 
 The parameters are:
 
@@ -161,7 +163,7 @@ Now that you have a stream of call events, you can set up a Stream Analytics job
 
 It's a good idea to place the job and the event hub in the same region for best performance and so that you don't pay to transfer data between regions.
 
-<img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-create-sa-job-new-portal.png" alt="drawing" width="300px"/>
+<img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-create-sa-job-new-portal.png" alt="Create Stream Analytics job in portal" width="300px"/>
 
 3. Click **Create**.
 
@@ -184,7 +186,7 @@ Now that you have a stream of call events, you can set up a Stream Analytics job
 |Event Hub name | asa-eh-frauddetection-demo | Select the name of your Event Hub. |
 |Event Hub policy name | asa-policy-manage-demo | Select the access policy that you created earlier. |
 </br>
-<img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-create-sa-input-new-portal.png" alt="drawing" width="300px"/>
+<img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-create-sa-input-new-portal.png" alt="Create Stream Analytics input in portal" width="300px"/>
 
 
 4. Click **Create**.
@@ -213,7 +215,7 @@ The TelcoGenerator app is sending call records to the event hub, and your Stream
 
 5. Set **Minutes** to 3 and then click **OK**.
 
-![Options for sampling the input stream, with "3 minutes" selected.](./media/stream-analytics-real-time-fraud-detection/stream-analytics-input-create-sample-data.png)
+![Options for sampling input stream with 3 minutes selected](./media/stream-analytics-real-time-fraud-detection/stream-analytics-input-create-sample-data.png)
 
 Azure samples 3 minutes' worth of data from the input stream and notifies you when the sample data is ready. (This takes a short while.)
 
@@ -226,11 +228,13 @@ As an alternative, you can get a .json file that has sample data in it [from Git
 If you want to archive every event, you can use a pass-through query to read all the fields in the payload of the event.
 
 1. In the query window, enter this query:
-
-SELECT
-    *
-FROM
-    CallStream
+
+```SQL
+SELECT
+    *
+FROM
+    CallStream
+```
 
 >[!NOTE]
 >As with SQL, keywords are not case-sensitive, and whitespace is not significant.
@@ -251,13 +255,15 @@ In many cases, your analysis doesn't need all the columns from the input stream.
 
 1. Change the query in the code editor to the following:
 
-SELECT CallRecTime, SwitchNum, CallingIMSI, CallingNum, CalledNum
-FROM
-    CallStream
+```SQL
+SELECT CallRecTime, SwitchNum, CallingIMSI, CallingNum, CalledNum
+FROM
+    CallStream
+```
 
 2. Click **Test** again.
 
-![Stream Analytics job output for projection, showing 25 records generated](./media/stream-analytics-real-time-fraud-detection/stream-analytics-sa-job-sample-output-projection.png)
+![Stream Analytics job output for projection shows 25 records](./media/stream-analytics-real-time-fraud-detection/stream-analytics-sa-job-sample-output-projection.png)
 
 ### Count incoming calls by region: Tumbling window with aggregation
 
@@ -267,11 +273,13 @@ For this transformation, you want a sequence of temporal windows that don't over
 
 1. Change the query in the code editor to the following:
 
+```SQL
 SELECT
    System.Timestamp as WindowEnd, SwitchNum, COUNT(*) as CallCount
 FROM
    CallStream TIMESTAMP BY CallRecTime
 GROUP BY TUMBLINGWINDOW(s, 5), SwitchNum
+```
 
 This query uses the `Timestamp By` keyword in the `FROM` clause to specify which timestamp field in the input stream to use to define the Tumbling window. In this case, the window divides the data into segments by the `CallRecTime` field in each record. (If no field is specified, the windowing operation uses the time that each event arrives at the event hub. See "Arrival Time Vs Application Time" in [Stream Analytics Query Language Reference](https://msdn.microsoft.com/library/azure/dn834998.aspx).)
 
@@ -281,7 +289,7 @@ For this transformation, you want a sequence of temporal windows that don't over
 
 2. Click **Test** again. In the results, notice that the timestamps under **WindowEnd** are in 5-second increments.
 
-![Stream Analytics job output for aggregation, showing 13 records generated](./media/stream-analytics-real-time-fraud-detection/stream-analytics-sa-job-sample-output-aggregation.png)
+![Stream Analytics job output for aggregation showing 13 records](./media/stream-analytics-real-time-fraud-detection/stream-analytics-sa-job-sample-output-aggregation.png)
 
 ### Detect SIM fraud using a self-join
 
@@ -293,6 +301,7 @@ When you use a join with streaming data, the join must provide some limits on ho
 
 1. Change the query in the code editor to the following:
 
+```SQL
 SELECT System.Timestamp as Time,
     CS1.CallingIMSI,
    CS1.CallingNum as CallingNum1,
@@ -304,6 +313,7 @@ When you use a join with streaming data, the join must provide some limits on ho
     ON CS1.CallingIMSI = CS2.CallingIMSI
     AND DATEDIFF(ss, CS1, CS2) BETWEEN 1 AND 5
 WHERE CS1.SwitchNum != CS2.SwitchNum
+```
 
 This query is like any SQL join except for the `DATEDIFF` function in the join. This version of `DATEDIFF` is specific to Streaming Analytics, and it must appear in the `ON...BETWEEN` clause. The parameters are a time unit (seconds in this example) and the aliases of the two sources for the join. This is different from the standard SQL `DATEDIFF` function.
 

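To make that placement rule concrete, here is a minimal self-join skeleton (an illustration only; `DeviceStream`, `DeviceId`, and `EventTime` are hypothetical names, not part of the tutorial) showing where the streaming `DATEDIFF` sits:

```SQL
-- Illustrative sketch with hypothetical names. The streaming DATEDIFF takes
-- a time unit plus the two source aliases, and it must appear in the
-- ON...BETWEEN clause to bound how far apart matching events may be.
SELECT S1.DeviceId
FROM DeviceStream S1 TIMESTAMP BY EventTime
JOIN DeviceStream S2 TIMESTAMP BY EventTime
    ON S1.DeviceId = S2.DeviceId
    AND DATEDIFF(second, S1, S2) BETWEEN 1 AND 5
```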
@@ -315,7 +325,7 @@ When you use a join with streaming data, the join must provide some limits on ho
 
 3. Click **Save** to save the self-join query as part of the Streaming Analytics job. (It doesn't save the sample data.)
 
-<img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-query-editor-save-button-new-portal.png" alt="drawing" width="300px"/>
+<img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-query-editor-save-button-new-portal.png" alt="Save Stream Analytics query in portal" width="300px"/>
 
 ## Create an output sink to store transformed data
 
@@ -329,7 +339,7 @@ If you have an existing blob storage account, you can use that. For this tutoria
 
 1. From the upper left-hand corner of the Azure portal, select **Create a resource** > **Storage** > **Storage account**. Fill out the Storage account job page with **Name** set to "asaehstorage", **Location** set to "East US", **Resource group** set to "asa-eh-ns-rg" (host the storage account in the same resource group as the Streaming job for increased performance). The remaining settings can be left at their default values.
 
-![Create storage account](./media/stream-analytics-real-time-fraud-detection/stream-analytics-storage-account-create.png)
+![Create storage account in Azure portal](./media/stream-analytics-real-time-fraud-detection/stream-analytics-storage-account-create.png)
 
 2. In the Azure portal, return to the Streaming Analytics job pane. (If you closed the pane, search for `asa_frauddetection_job_demo` in the **All resources** pane.)
 
@@ -344,7 +354,7 @@ If you have an existing blob storage account, you can use that. For this tutoria
 |Storage account | asaehstorage | Enter the name of the storage account you created. |
 |Container | asa-fraudulentcalls-demo | Choose Create new and enter a container name. |
 <br/>
-<img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-create-output-blob-storage-new-console.png" alt="drawing" width="300px"/>
+<img src="./media/stream-analytics-real-time-fraud-detection/stream-analytics-create-output-blob-storage-new-console.png" alt="Create blob output for Stream Analytics job" width="300px"/>
 
 5. Click **Save**.
 
@@ -365,7 +375,7 @@ The job is now configured. You've specified an input (the event hub), a transfor
 
 You now have a complete Streaming Analytics job. The job is examining a stream of phone call metadata, looking for fraudulent phone calls in real time, and writing information about those fraudulent calls to storage.
 
-To complete this tutorial, you might want to look at the data being captured by the Streaming Analytics job. The data is being written to Azure Blob Storage in chunks (files). You can use any tool that reads Azure Blob Storage. As noted in the Prerequisites section, you can use Azure extensions in Visual Studio, or you can use a tool like [Azure Storage Explorer](http://storageexplorer.com/) or [Azure Explorer](http://www.cerebrata.com/products/azure-explorer/introduction).
+To complete this tutorial, you might want to look at the data being captured by the Streaming Analytics job. The data is being written to Azure Blob Storage in chunks (files). You can use any tool that reads Azure Blob Storage. As noted in the Prerequisites section, you can use Azure extensions in Visual Studio, or you can use a tool like [Azure Storage Explorer](https://storageexplorer.com/) or [Cerulean](https://www.cerebrata.com/products/cerulean/features/azure-storage).
 
 When you examine the contents of a file in blob storage, you see something like the following:

articles/stream-analytics/stream-analytics-scale-jobs.md

Lines changed: 10 additions & 9 deletions
@@ -43,15 +43,16 @@ If your query is not embarrassingly parallel, you can follow the following steps
 
 Query:
 
-WITH Step1 AS (
-    SELECT COUNT(*) AS Count, TollBoothId, PartitionId
-    FROM Input1 Partition By PartitionId
-    GROUP BY TumblingWindow(minute, 3), TollBoothId, PartitionId
-)
-SELECT SUM(Count) AS Count, TollBoothId
-FROM Step1
-GROUP BY TumblingWindow(minute, 3), TollBoothId
-
+```SQL
+WITH Step1 AS (
+    SELECT COUNT(*) AS Count, TollBoothId, PartitionId
+    FROM Input1 Partition By PartitionId
+    GROUP BY TumblingWindow(minute, 3), TollBoothId, PartitionId
+)
+SELECT SUM(Count) AS Count, TollBoothId
+FROM Step1
+GROUP BY TumblingWindow(minute, 3), TollBoothId
+```
 In the query above, you are counting cars per toll booth per partition, and then adding the counts from all partitions together.
 
 Once the step is partitioned, you can allocate up to 6 SUs for each partition of the step; 6 SUs per partition is the maximum, so each partition can be placed on its own processing node.

articles/stream-analytics/stream-analytics-scale-with-machine-learning-functions.md

Lines changed: 2 additions & 2 deletions
@@ -4,7 +4,6 @@ description: This article describes how to scale Stream Analytics jobs that use
 services: stream-analytics
 author: jseb225
 ms.author: jeanb
-manager: kfile
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
@@ -40,14 +39,15 @@ The following example includes a Stream Analytics job with the sentiment analysi
 
 The query is a simple fully partitioned query followed by the **sentiment** function, as shown in the following example:
 
+```SQL
 WITH subquery AS (
     SELECT text, sentiment(text) as result from input
 )
 
 Select text, result.[Score]
 Into output
 From subquery
-
+```
 Consider the following scenario: with a throughput of 10,000 tweets per second, a Stream Analytics job must be created to perform sentiment analysis of the tweets (events). Using 1 SU, can this Stream Analytics job handle the traffic? With the default batch size of 1000, the job should be able to keep up with the input. Further, the added Machine Learning function should generate no more than a second of latency, which is the general default latency of the sentiment analysis Machine Learning web service (with a default batch size of 1000). The Stream Analytics job's **overall** or end-to-end latency would typically be a few seconds. Take a more detailed look into this Stream Analytics job, *especially* the Machine Learning function calls. With a batch size of 1000, a throughput of 10,000 events takes about 10 requests to the web service per second. Even with 1 SU, there are enough concurrent connections to accommodate this input traffic.
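As a quick sanity check on that request rate (my arithmetic, using only the figures stated above):

$$\frac{10{,}000\ \text{events/second}}{1{,}000\ \text{events/batch}} = 10\ \text{web-service requests/second}$$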
 
 If the input event rate increases by 100x, then the Stream Analytics job needs to process 1,000,000 tweets per second. There are two options to accomplish the increased scale:

articles/stream-analytics/stream-analytics-set-up-alerts.md

Lines changed: 4 additions & 4 deletions
@@ -4,11 +4,11 @@ description: This article describes how to use the Azure portal to set up monito
 services: stream-analytics
 author: jseb225
 ms.author: jeanb
-manager: kfile
-ms.reviewer: jasonh
+ms.reviewer: mamccrea
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 06/26/2017
+ms.date: 12/07/2018
+ms.custom: seodec18
 ---
 # Set up alerts for Azure Stream Analytics jobs
 You can set up alerts that trigger when a metric reaches a condition that you specify. For example, you might set up an alert for a condition like the following:
@@ -24,7 +24,7 @@ Rules can be set up on metrics through the portal, or can be configured [program
 
 3. In the **Metric** blade, click the **Add alert** command.
 
-![Azure portal setup](./media/stream-analytics-set-up-alerts/06-stream-analytics-set-up-alerts.png)
+![Azure portal Stream Analytics alerts setup](./media/stream-analytics-set-up-alerts/06-stream-analytics-set-up-alerts.png)
 
 4. Enter a name and a description.

articles/stream-analytics/stream-analytics-stream-analytics-query-patterns.md

Lines changed: 32 additions & 13 deletions
@@ -3,7 +3,6 @@ title: Common query patterns in Azure Stream Analytics
 description: This article describes a number of common query patterns and designs that are useful in Azure Stream Analytics jobs.
 services: stream-analytics
 author: jseb225
-manager: kfile
 ms.author: jeanb
 ms.reviewer: jasonh
 ms.service: stream-analytics
@@ -43,6 +42,7 @@ For example, the car weight is coming on the input stream as strings and needs t
 
 **Solution**:
 
+```SQL
 SELECT
     Make,
     SUM(CAST(Weight AS BIGINT)) AS Weight
@@ -51,6 +51,7 @@ For example, the car weight is coming on the input stream as strings and needs t
 GROUP BY
     Make,
     TumblingWindow(second, 10)
+```
 
 **Explanation**:
 Use a **CAST** statement in the **Weight** field to specify its data type. See the list of supported data types in [Data types (Azure Stream Analytics)](https://msdn.microsoft.com/library/azure/dn835065.aspx).
@@ -76,12 +77,14 @@ For example, check that the result returns license plates that start with A and
 
 **Solution**:
 
+```SQL
 SELECT
     *
 FROM
     Input TIMESTAMP BY Time
 WHERE
     LicensePlate LIKE 'A%9'
+```
 
 **Explanation**:
 Use the **LIKE** statement to check the **LicensePlate** field value. It should start with an A, then have any string of zero or more characters, and then end with a 9.
@@ -107,6 +110,7 @@ For example, provide a string description for how many cars of the same make pas
 
 **Solution**:
 
+```SQL
 SELECT
     CASE
         WHEN COUNT(*) = 1 THEN CONCAT('1 ', Make)
@@ -118,6 +122,7 @@ For example, provide a string description for how many cars of the same make pas
 GROUP BY
     Make,
     TumblingWindow(second, 10)
+```
 
 **Explanation**:
 The **CASE** expression compares an expression to a set of simple expressions to determine the result. In this example, vehicle makes with a count of 1 returned a different string description than vehicle makes with a count other than 1.
@@ -154,6 +159,7 @@ For example, analyze data for a threshold-based alert and archive all events to
 
 **Solution**:
 
+```SQL
 SELECT
     *
 INTO
@@ -174,6 +180,7 @@ For example, analyze data for a threshold-based alert and archive all events to
     TumblingWindow(second, 10)
 HAVING
     [Count] >= 3
+```
 
 **Explanation**:
 The **INTO** clause tells Stream Analytics which of the outputs to write the data to from this statement.
@@ -183,6 +190,7 @@ The second query does some simple aggregation and filtering, and it sends the re
 Note that you can also reuse the results of the common table expressions (CTEs) (such as **WITH** statements) in multiple output statements. This option has the added benefit of opening fewer readers to the input source.
 For example:
 
+```SQL
 WITH AllRedCars AS (
     SELECT
         *
@@ -193,6 +201,7 @@ For example:
 )
 SELECT * INTO HondaOutput FROM AllRedCars WHERE Make = 'Honda'
 SELECT * INTO ToyotaOutput FROM AllRedCars WHERE Make = 'Toyota'
+```
 
 ## Query example: Count unique values
 **Description**: Count the number of unique field values that appear in the stream within a time window.
@@ -217,14 +226,14 @@ For example, how many unique makes of cars passed through the toll booth in a 2-
 
 **Solution:**
 
-````
+```SQL
 SELECT
     COUNT(DISTINCT Make) AS CountMake,
     System.TIMESTAMP AS TIME
 FROM Input TIMESTAMP BY TIME
 GROUP BY
     TumblingWindow(second, 2)
-````
+```
 
 
 **Explanation:**
@@ -249,13 +258,15 @@ For example, is the previous car on the toll road the same make as the current c
 
 **Solution**:
 
+```SQL
 SELECT
     Make,
     Time
 FROM
     Input TIMESTAMP BY Time
 WHERE
     LAG(Make, 1) OVER (LIMIT DURATION(minute, 1)) <> Make
+```
 
 **Explanation**:
 Use **LAG** to peek into the input stream one event back and get the **Make** value. Then compare it to the **Make** value on the current event and output the event if they are different.
@@ -284,6 +295,7 @@ Use **LAG** to peek into the input stream one event back and get the **Make** va
 
 **Solution**:
 
+```SQL
 SELECT
     LicensePlate,
     Make,
@@ -292,6 +304,7 @@ Use **LAG** to peek into the input stream one event back and get the **Make** va
     Input TIMESTAMP BY Time
 WHERE
     IsFirst(minute, 10) = 1
+```
 
 Now let’s change the problem and find the first car of a particular make in every 10-minute interval.
 
@@ -305,6 +318,7 @@ Now let’s change the problem and find the first car of a particular make in ev
 
 **Solution**:
 
+```SQL
 SELECT
     LicensePlate,
     Make,
@@ -313,6 +327,7 @@ Now let’s change the problem and find the first car of a particular make in ev
     Input TIMESTAMP BY Time
 WHERE
     IsFirst(minute, 10) OVER (PARTITION BY Make) = 1
+```
 
 ## Query example: Find the last event in a window
 **Description**: Find the last car in every 10-minute interval.
@@ -338,6 +353,7 @@ Now let’s change the problem and find the first car of a particular make in ev
 
 **Solution**:
 
+```SQL
 WITH LastInWindow AS
 (
     SELECT
@@ -356,6 +372,7 @@ Now let’s change the problem and find the first car of a particular make in ev
 INNER JOIN LastInWindow
     ON DATEDIFF(minute, Input, LastInWindow) BETWEEN 0 AND 10
     AND Input.Time = LastInWindow.LastEventTime
+```
 
 **Explanation**:
 There are two steps in the query. The first one finds the latest time stamp in 10-minute windows. The second step joins the results of the first query with the original stream to find the events that match the last time stamps in each window.
@@ -381,6 +398,7 @@ For example, have 2 consecutive cars from the same make entered the toll road wi
 
 **Solution**:
 
+```SQL
 SELECT
     Make,
     Time,
@@ -391,6 +409,7 @@ For example, have 2 consecutive cars from the same make entered the toll road wi
     Input TIMESTAMP BY Time
 WHERE
     LAG(Make, 1) OVER (LIMIT DURATION(second, 90)) = Make
+```
 
 **Explanation**:
 Use **LAG** to peek into the input stream one event back and get the **Make** value. Compare it to the **MAKE** value in the current event, and then output the event if they are the same. You can also use **LAG** to get data about the previous car.
@@ -413,13 +432,13 @@ Use **LAG** to peek into the input stream one event back and get the **Make** va
 
 **Solution**:
 
-````
+```SQL
 SELECT
     [user], feature, DATEDIFF(second, LAST(Time) OVER (PARTITION BY [user], feature LIMIT DURATION(hour, 1) WHEN Event = 'start'), Time) as duration
 FROM input TIMESTAMP BY Time
 WHERE
     Event = 'end'
-````
+```
 
 **Explanation**:
 Use the **LAST** function to retrieve the last **TIME** value when the event type was **start**. The **LAST** function uses **PARTITION BY [user]** to indicate that the result is computed per unique user. The query has a 1-hour maximum threshold for the time difference between **start** and **end** events, but it is configurable as needed (**LIMIT DURATION(hour, 1)**).
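As an illustrative trace, here is the same query annotated with a hypothetical pair of events (the user, feature, and timestamps below are my invention, not from the article):

```SQL
-- Hypothetical input for user 'alice', feature 'search':
--   Event = 'start' at 2018-12-07T10:00:00Z
--   Event = 'end'   at 2018-12-07T10:00:42Z
SELECT
    [user], feature,
    DATEDIFF(second,
        LAST(Time) OVER (PARTITION BY [user], feature
                         LIMIT DURATION(hour, 1)
                         WHEN Event = 'start'),  -- resolves to 10:00:00
        Time) AS duration                        -- outputs 42 (seconds)
FROM input TIMESTAMP BY Time
WHERE Event = 'end'
```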
@@ -449,7 +468,7 @@ For example, suppose that a bug resulted in all cars having an incorrect weight
 
 **Solution**:
 
-````
+```SQL
 WITH SelectPreviousEvent AS
 (
     SELECT
@@ -466,7 +485,7 @@ For example, suppose that a bug resulted in all cars having an incorrect weight
 WHERE
     [weight] < 20000
     AND previousWeight > 20000
-````
+```
 
 **Explanation**:
 Use **LAG** to view the input stream for 24 hours and look for instances where **StartFault** and **StopFault** are spanned by the weight < 20000.
@@ -503,13 +522,14 @@ For example, generate an event every 5 seconds that reports the most recently se
 
 **Solution**:
 
+```SQL
 SELECT
     System.Timestamp AS windowEnd,
     TopOne() OVER (ORDER BY t DESC) AS lastEvent
 FROM
     input TIMESTAMP BY t
 GROUP BY HOPPINGWINDOW(second, 300, 5)
-
+```
 
 **Explanation**:
 This query generates events every 5 seconds and outputs the last event that was received previously. The [Hopping window](https://msdn.microsoft.com/library/dn835041.aspx "Hopping window--Azure Stream Analytics") duration determines how far back the query looks to find the latest event (300 seconds in this example).
@@ -550,7 +570,7 @@ For example, in an IoT scenario for home ovens, an alert must be generated when
 
 **Solution**:
 
-````
+```SQL
 WITH max_power_during_last_3_mins AS (
     SELECT
         System.TimeStamp AS windowTime,
@@ -584,7 +604,7 @@ WHERE
     t1.sensorName = 'temp'
     AND t1.value <= 40
     AND t2.maxPower > 10
-````
+```
 
 **Explanation**:
 The first query, `max_power_during_last_3_mins`, uses the [Sliding window](https://msdn.microsoft.com/azure/stream-analytics/reference/sliding-window-azure-stream-analytics) to find the max value of the power sensor for every device during the last 3 minutes.
@@ -619,15 +639,14 @@ And then, provided the conditions are met, an alert is generated for the device.
 
 **Solution**:
 
-````
+```SQL
 SELECT
     TollId,
     COUNT(*) AS Count
 FROM input
     TIMESTAMP BY Time OVER TollId
 GROUP BY TUMBLINGWINDOW(second, 5), TollId
-
-````
+```
 
 **Explanation**:
 The [TIMESTAMP BY OVER](https://msdn.microsoft.com/azure/stream-analytics/reference/timestamp-by-azure-stream-analytics#over-clause-interacts-with-event-ordering) clause looks at each device timeline separately using substreams. The output events for each TollID are generated as they are computed, meaning that the events are in order with respect to each TollID instead of being reordered as if all devices were on the same clock.
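For contrast, a sketch of the same aggregation without the `OVER` clause (my illustration, using the same hypothetical input): all toll booths then share one global timeline, so events are ordered across devices rather than per `TollId`:

```SQL
-- Without OVER TollId, a single clock governs every booth's events.
SELECT
    TollId,
    COUNT(*) AS Count
FROM input
    TIMESTAMP BY Time
GROUP BY TUMBLINGWINDOW(second, 5), TollId
```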

articles/stream-analytics/stream-analytics-test-query.md

Lines changed: 4 additions & 5 deletions
@@ -1,15 +1,14 @@
 ---
 title: Test an Azure Stream Analytics job with sample data
-description: How to test your queries in Stream Analytics jobs.
-keywords: This article describes how to use the Azure portal to test an Azure Stream Analytics job, sample input, and upload sample data.
+description: This article describes how to use the Azure portal to test an Azure Stream Analytics job, sample input, and upload sample data.
 services: stream-analytics
-author: jasonwhowell
+author: mamccrea
 ms.author: mamccrea
 ms.reviewer: jasonh
-manager: kfile
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 04/27/2018
+ms.date: 12/07/2018
+ms.custom: seodec18
 ---
 
 # Test a Stream Analytics query with sample data

articles/stream-analytics/stream-analytics-tools-for-visual-studio-cicd.md

Lines changed: 2 additions & 3 deletions
@@ -4,7 +4,6 @@ description: This article describes how to use Visual Studio tools for Azure Str
 services: stream-analytics
 author: su-jie
 ms.author: sujie
-manager: kfile
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
@@ -74,11 +73,11 @@ localrun -Project [ProjectFullPath]
 
 The *arm* command takes the job template and job template parameter files generated through build as input. Then it combines them into a job definition JSON file that can be used with the Stream Analytics PowerShell API.
 
-```
+```powershell
 arm -JobTemplate <templateFilePath> -JobParameterFile <jobParameterFilePath> [-OutputFile <asaArmFilePath>]
 ```
 Example:
-```
+```powershell
 ./tools/SA.exe arm -JobTemplate "ProjectA.JobTemplate.json" -JobParameterFile "ProjectA.JobTemplate.parameters.json" -OutputFile "JobDefinition.json"
 ```

articles/stream-analytics/stream-analytics-tools-for-visual-studio-edge-jobs.md

Lines changed: 12 additions & 12 deletions
@@ -4,11 +4,11 @@ description: This article describes how to author, debug, and create your Stream
 services: stream-analytics
 author: su-jie
 ms.author: sujie
-manager: kfile
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 03/13/2018
+ms.date: 12/07/2018
+ms.custom: seodec18
 ---
 
 # Develop Stream Analytics Edge jobs using Visual Studio tools
@@ -27,11 +27,11 @@ You need the following prerequisites to complete this tutorial:
 
 From Visual Studio, select **File** > **New** > **Project**. Navigate to the **Templates** list on the left > expand **Azure Stream Analytics** > **Stream Analytics Edge** > **Azure Stream Analytics Edge Application**. Provide a Name, Location, and Solution name for your project and select **OK**.
 
-![New Edge project](./media/stream-analytics-tools-for-visual-studio-edge-jobs/new-edge-project.png)
+![New Edge project in Visual Studio](./media/stream-analytics-tools-for-visual-studio-edge-jobs/new-stream-analytics-edge-project.png)
 
 After the project gets created, navigate to the **Solution Explorer** to view the folder hierarchy.
 
-![Solution explorer view](./media/stream-analytics-tools-for-visual-studio-edge-jobs/edge-project-in-solution-explorer.png)
+![Solution explorer view of Stream Analytics Edge job](./media/stream-analytics-tools-for-visual-studio-edge-jobs/edge-project-in-solution-explorer.png)
 
 
 ## Choose the correct subscription
@@ -45,7 +45,7 @@ After the project gets created, navigate to the **Solution Explorer** to view th
 1. From the **Solution Explorer**, expand the **Inputs** node; you should see an input named **EdgeInput.json**. Double-click to view its settings.
 
 2. Make sure that Source Type is set to **Data Stream** > Source is set to **Edge Hub** > Event Serialization Format set to **Json** > and Encoding is set to **UTF8**. Optionally, you can rename the **Input Alias**; let's leave it as is for this example. If you rename the input alias, use the name you specified when defining the query. Select **Save** to save the settings.
-![Input configuration](./media/stream-analytics-tools-for-visual-studio-edge-jobs/stream-analytics-input-configuration.png)
+![Stream Analytics job input configuration](./media/stream-analytics-tools-for-visual-studio-edge-jobs/stream-analytics-input-configuration.png)
 
 
 
@@ -54,7 +54,7 @@ After the project gets created, navigate to the **Solution Explorer** to view th
 1. From the **Solution Explorer**, expand the **Outputs** node; you should see an output named **EdgeOutput.json**. Double-click to view its settings.
 
 2. Make sure that Sink is set to **Edge Hub** > Event Serialization Format set to **Json** > and Encoding is set to **UTF8** > and Format is set to **Array**. Optionally, you can rename the **Output Alias**; let's leave it as is for this example. If you rename the output alias, use the name you specified when defining the query. Select **Save** to save the settings.
-![Output configuration](./media/stream-analytics-tools-for-visual-studio-edge-jobs/stream-analytics-output-configuration.png)
+![Stream Analytics job output configuration](./media/stream-analytics-tools-for-visual-studio-edge-jobs/stream-analytics-output-configuration.png)
 
 ## Define the transformation query
 
@@ -82,17 +82,17 @@ To test the query locally, you should upload the sample data. You can get sample
 1. To upload sample data, right-click the **EdgeInput.json** file > choose **Add Local Input**.
 
 2. In the pop-up window > **Browse** the sample data from your local path > Select **Save**.
-![Local input configuration](./media/stream-analytics-tools-for-visual-studio-edge-jobs/stream-analytics-local-input-configuration.png)
+![Local input configuration in Visual Studio](./media/stream-analytics-tools-for-visual-studio-edge-jobs/stream-analytics-local-input-configuration.png)
 
 3. A file named **local_EdgeInput.json** is added automatically to your inputs folder.
 4. You can either run it locally or submit to Azure. To test the query > Select **Run Locally**.
-![Run options](./media/stream-analytics-tools-for-visual-studio-edge-jobs/run-options.png)
+![Stream Analytics job run options in Visual Studio](./media/stream-analytics-tools-for-visual-studio-edge-jobs/stream-analytics-visual-stuidio-run-options.png)
 
 5. The command prompt window shows the status of the job. When the job runs successfully, it creates a folder that looks like "2018-02-23-11-31-42" in your project folder path "Visual Studio 2015\Projects\MyASAEdgejob\MyASAEdgejob\ASALocalRun\2018-02-23-11-31-42". Navigate to the folder path to view the results in the local folder:
 
 You can also sign in to the Azure portal and verify that the job is created.
 
-![Result folder](./media/stream-analytics-tools-for-visual-studio-edge-jobs/result-folder.png)
+![Stream Analytics job result folder](./media/stream-analytics-tools-for-visual-studio-edge-jobs/stream-analytics-job-result-folder.png)
 
 ## Submit the job to Azure
 
@@ -102,19 +102,19 @@ To test the query locally, you should upload the sample data. You can get sample
 
 3. A pop-up window opens, where you can choose to update an existing Edge job or create a new one. When you update an existing job, it replaces all the job configuration; in this scenario, you will publish a new job. Select **Create a New Azure Stream Analytics Job** > enter a name for your job, something like **MyASAEdgeJob** > choose the required **Subscription**, **Resource Group**, and **Location** > Select **Submit**.
 
-![Submit to Azure](./media/stream-analytics-tools-for-visual-studio-edge-jobs/submit-to-azure.png)
+![Submit Stream Analytics job to Azure from Visual Studio](./media/stream-analytics-tools-for-visual-studio-edge-jobs/submit-stream-analytics-job-to-azure.png)
 
 Now that your Stream Analytics Edge job has been created, you can refer to the [Run jobs on IoT Edge tutorial](stream-analytics-edge.md) to learn how to deploy it to your devices.
 
 ## Manage the job
 
 You can view the status of the job and the job diagram from the Server Explorer. From the **Server Explorer** > **Stream Analytics** > expand the subscription and the resource group where you deployed the Edge job > you can view the MyASAEdgejob with status **Created**. Expand your job node and double-click on it to open the job view.
 
-![Server explorer options](./media/stream-analytics-tools-for-visual-studio-edge-jobs/server-explorer-options.png)
+![Server explorer job management options](./media/stream-analytics-tools-for-visual-studio-edge-jobs/server-explorer-options.png)
 
 The job view window provides you with operations such as refreshing the job, deleting the job, and opening the job from the Azure portal.
 
-![Job diagram and other options](./media/stream-analytics-tools-for-visual-studio-edge-jobs/job-diagram-and-other-options.png)
+![Job diagram and other options in Visual Studio](./media/stream-analytics-tools-for-visual-studio-edge-jobs/job-diagram-and-other-options.png)
 
 ## Next steps
 