
Commit ed3c4d4

Committed Jun 11, 2019
freshness
1 parent 50d40b6 commit ed3c4d4

15 files changed, +79 -95 lines changed
 

‎articles/stream-analytics/stream-analytics-add-inputs.md

Lines changed: 2 additions & 2 deletions

@@ -4,12 +4,12 @@ description: This article describe the concept of inputs in an Azure Stream Anal
 services: stream-analytics
 author: jseb225
 ms.author: jeanb
-manager: kfile
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 04/25/2018
+ms.date: 06/11/2019
 ---
+
 # Understand inputs for Azure Stream Analytics

 Azure Stream Analytics jobs connect to one or more data inputs. Each input defines a connection to an existing data source. Stream Analytics accepts data incoming from several kinds of event sources including Event Hubs, IoT Hub, and Blob storage. The inputs are referenced by name in the streaming SQL query that you write for each job. In the query, you can join multiple inputs to blend data or compare streaming data with a lookup to reference data, and pass the results to outputs.
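To make the naming pattern in that paragraph concrete, a minimal streaming SQL sketch follows; the aliases `SensorStream` and `AlertsOutput` are hypothetical, standing in for whatever input and output aliases a job defines:

```sql
-- Sketch: read from a named input, aggregate, and route to a named output.
-- "SensorStream" and "AlertsOutput" are hypothetical aliases.
SELECT
    DeviceId,
    AVG(Temperature) AS AvgTemperature
INTO "AlertsOutput"
FROM "SensorStream" TIMESTAMP BY EventTime
GROUP BY DeviceId, TumblingWindow(second, 30)
```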

‎articles/stream-analytics/stream-analytics-get-started-with-azure-stream-analytics-to-process-data-from-iot-devices.md

Lines changed: 5 additions & 3 deletions

@@ -7,17 +7,19 @@ ms.author: mamccrea
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 12/06/2018
-ms.custom: seodec18
+ms.date: 06/11/2019
 ---
 # Get started with Azure Stream Analytics to process data from IoT devices
-In this tutorial, you will learn how to create stream-processing logic to gather data from Internet of Things (IoT) devices. We will use a real-world, Internet of Things (IoT) use case to demonstrate how to build your solution quickly and economically.
+
+In this tutorial, you learn how to create stream-processing logic to gather data from Internet of Things (IoT) devices. We will use a real-world, Internet of Things (IoT) use case to demonstrate how to build your solution quickly and economically.

 ## Prerequisites
+
 * [Azure subscription](https://azure.microsoft.com/pricing/free-trial/)
 * Sample query and data files downloadable from [GitHub](https://aka.ms/azure-stream-analytics-get-started-iot)

 ## Scenario
+
 Contoso, which is a company in the industrial automation space, has completely automated its manufacturing process. The machinery in this plant has sensors that are capable of emitting streams of data in real time. In this scenario, a production floor manager wants to have real-time insights from the sensor data to look for patterns and take actions on them. We will use the Stream Analytics Query Language (SAQL) over the sensor data to find interesting patterns from the incoming stream of data.

 Here data is being generated from a Texas Instruments sensor tag device. The payload of the data is in JSON format and looks like the following:
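The sample payload itself is cut off in this diff. For orientation only, a reading of this shape carries a timestamp, a device label, and sensor values; the field names below are illustrative rather than taken from this commit:

```json
{
    "time": "2019-06-11T17:30:00.0000000Z",
    "dspl": "sensorA",
    "temp": 72.3,
    "hmdt": 48.6
}
```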

‎articles/stream-analytics/stream-analytics-power-bi-dashboard.md

Lines changed: 39 additions & 51 deletions

@@ -7,10 +7,10 @@ ms.author: jeanb
 ms.reviewer: mamccrea
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 12/07/2018
-ms.custom: seodec18
+ms.date: 06/11/2019
 ---
-# Tutorial: Stream Analytics and Power BI: A real-time analytics dashboard for streaming data
+# Stream Analytics and Power BI: A real-time analytics dashboard for streaming data
+
 Azure Stream Analytics enables you to take advantage of one of the leading business intelligence tools, [Microsoft Power BI](https://powerbi.com/). In this article, you learn how create business intelligence tools by using Power BI as an output for your Azure Stream Analytics jobs. You also learn how to create and use a real-time dashboard.

 This article continues from the Stream Analytics [real-time fraud detection](stream-analytics-real-time-fraud-detection.md) tutorial. It builds on the workflow created in that tutorial and adds a Power BI output so that you can visualize fraudulent phone calls that are detected by a Streaming Analytics job.
@@ -32,41 +32,31 @@ In the real-time fraud detection tutorial, the output is sent to Azure Blob stor

 1. In the Azure portal, open the Streaming Analytics job that you created earlier. If you used the suggested name, the job is named `sa_frauddetection_job_demo`.

-2. Select the **Outputs** box in the middle of the job dashboard and then select **+ Add**.
-
-3. For **Output Alias**, enter `CallStream-PowerBI`. You can use a different name. If you do, make a note of it, because you need the name later.
-
-4. Under **Sink**, select **Power BI**.
-
-![Create an output for Power BI](./media/stream-analytics-power-bi-dashboard/create-power-bi-ouptut.png)
-
-5. Click **Authorize**.
+2. On the left menu, select **Outputs** under **Job topology**. Then, select **+ Add** and choose **Power BI** from the dropdown menu.

-A window opens where you can provide your Azure credentials for a work or school account.
+3. Select **+ Add** > **Power BI**. Then fill the form with the following details and select **Authorize**:

-![Enter credentials for access to Power BI](./media/stream-analytics-power-bi-dashboard/power-bi-authorization-credentials.png)
+|**Setting**  |**Suggested value** |
+|---------|---------|
+|Output alias  | CallStream-PowerBI |
+|Dataset name  | sa-dataset |
+|Table name  | fraudulent-calls |

-6. Enter your credentials. Be aware then when you enter your credentials, you're also giving permission to the Streaming Analytics job to access your Power BI area.
+![Configure Stream Analytics output](media/stream-analytics-power-bi-dashboard/configure-stream-analytics-output.png)

-7. When you're returned to the **New output** blade, enter the following information:
+> [!WARNING]
+> If Power BI has a dataset and table that have the same names as the ones that you specify in the Stream Analytics job, the existing ones are overwritten.
+> We recommend that you do not explicitly create this dataset and table in your Power BI account. They are automatically created when you start your Stream Analytics job and the job starts pumping output into Power BI. If your job query doesn't return any results, the dataset and table are not created.
+>

-* **Group Workspace**: Select a workspace in your Power BI tenant where you want to create the dataset.
-* **Dataset Name**: Enter `sa-dataset`. You can use a different name. If you do, make a note of it for later.
-* **Table Name**: Enter `fraudulent-calls`. Currently, Power BI output from Stream Analytics jobs can have only one table in a dataset.
-
-![Power BI workspace dataset and table](./media/stream-analytics-power-bi-dashboard/create-pbi-ouptut-with-dataset-table.png)
-
-> [!WARNING]
-> If Power BI has a dataset and table that have the same names as the ones that you specify in the Stream Analytics job, the existing ones are overwritten.
-> We recommend that you do not explicitly create this dataset and table in your Power BI account. They are automatically created when you start your Stream Analytics job and the job starts pumping output into Power BI. If your job query doesn't return any results, the dataset and table are not created.
->
+4. When you select **Authorize**, a pop-up window opens and you are asked to provide credentials to authenticate to your Power BI account. Once the authorization is successful, **Save** the settings.

 8. Click **Create**.

 The dataset is created with the following settings:

-* **defaultRetentionPolicy: BasicFIFO**: Data is FIFO, with a maximum of 200,000 rows.
-* **defaultMode: pushStreaming**: The dataset supports both streaming tiles and traditional report-based visuals (a.k.a. push).
+* **defaultRetentionPolicy: BasicFIFO** - Data is FIFO, with a maximum of 200,000 rows.
+* **defaultMode: pushStreaming** - The dataset supports both streaming tiles and traditional report-based visuals (a.k.a. push).

 Currently, you can't create datasets with other flags.

@@ -84,54 +74,52 @@ For more information about Power BI datasets, see the [Power BI REST API](https:
 >[!NOTE]
 >If you did not name the input `CallStream` in the fraud-detection tutorial, substitute your name for `CallStream` in the **FROM** and **JOIN** clauses in the query.

-```SQL
-/* Our criteria for fraud:
-Calls made from the same caller to two phone switches in different locations (for example, Australia and Europe) within five seconds */
+```SQL
+/* Our criteria for fraud:
+Calls made from the same caller to two phone switches in different locations (for example, Australia and Europe) within five seconds */

-SELECT System.Timestamp AS WindowEnd, COUNT(*) AS FraudulentCalls
-INTO "CallStream-PowerBI"
-FROM "CallStream" CS1 TIMESTAMP BY CallRecTime
-JOIN "CallStream" CS2 TIMESTAMP BY CallRecTime
+SELECT System.Timestamp AS WindowEnd, COUNT(*) AS FraudulentCalls
+INTO "CallStream-PowerBI"
+FROM "CallStream" CS1 TIMESTAMP BY CallRecTime
+JOIN "CallStream" CS2 TIMESTAMP BY CallRecTime

-/* Where the caller is the same, as indicated by IMSI (International Mobile Subscriber Identity) */
-ON CS1.CallingIMSI = CS2.CallingIMSI
+/* Where the caller is the same, as indicated by IMSI (International Mobile Subscriber Identity) */
+ON CS1.CallingIMSI = CS2.CallingIMSI

-/* ...and date between CS1 and CS2 is between one and five seconds */
-AND DATEDIFF(ss, CS1, CS2) BETWEEN 1 AND 5
+/* ...and date between CS1 and CS2 is between one and five seconds */
+AND DATEDIFF(ss, CS1, CS2) BETWEEN 1 AND 5

-/* Where the switch location is different */
-WHERE CS1.SwitchNum != CS2.SwitchNum
-GROUP BY TumblingWindow(Duration(second, 1))
-```
+/* Where the switch location is different */
+WHERE CS1.SwitchNum != CS2.SwitchNum
+GROUP BY TumblingWindow(Duration(second, 1))
+```

 4. Click **Save**.


 ## Test the query
+
 This section is optional, but recommended.

 1. If the TelcoStreaming app is not currently running, start it by following these steps:

-* Open a command window.
+* Open Command Prompt.
 * Go to the folder where the telcogenerator.exe and modified telcodatagen.exe.config files are.
 * Run the following command:

 `telcodatagen.exe 1000 .2 2`

-2. In the **Query** blade, click the dots next to the `CallStream` input and then select **Sample data from input**.
+2. On the **Query** page for your Stream Analytics job, click the dots next to the `CallStream` input and then select **Sample data from input**.

 3. Specify that you want three minutes' worth of data and click **OK**. Wait until you're notified that the data has been sampled.

-4. Click **Test** and make sure you're getting results.
-
+4. Click **Test** and review the results.

 ## Run the job

-1. Make sure that the TelcoStreaming app is running.
-
-2. Close the **Query** blade.
+1. Make sure the TelcoStreaming app is running.

-3. In the job blade, click **Start**.
+2. Navigate to the **Overview** page for your Stream Analytics job and select **Start**.

 ![Start the Stream Analytics job](./media/stream-analytics-power-bi-dashboard/stream-analytics-sa-job-start-output.png)

‎articles/stream-analytics/stream-analytics-quick-create-vs.md

Lines changed: 2 additions & 2 deletions

@@ -4,10 +4,10 @@ description: This quickstart shows you how to get started by creating a Stream A
 services: stream-analytics
 author: mamccrea
 ms.author: mamccrea
-ms.date: 12/20/2018
+ms.date: 06/11/2019
 ms.topic: quickstart
 ms.service: stream-analytics
-ms.custom: mvc
+
 #Customer intent: "As an IT admin/developer I want to create a Stream Analytics job, configure input and output & analyze data by using Visual Studio."
 ---

‎articles/stream-analytics/stream-analytics-real-time-fraud-detection.md

Lines changed: 4 additions & 3 deletions

@@ -125,11 +125,12 @@ Before you start the TelcoGenerator app, you must configure it so that it will s

 ### Start the app
 1. Open a command window and change to the folder where the TelcoGenerator app is unzipped.
+
 2. Enter the following command:

-```cmd
-telcodatagen.exe 1000 0.2 2
-```
+```cmd
+telcodatagen.exe 1000 0.2 2
+```

 The parameters are:
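The parameter list itself is truncated in this diff, but the Azure Functions tutorial later in this commit states the invocation format, repeated here for convenience:

```cmd
telcodatagen.exe [#NumCDRsPerHour] [SIM Card Fraud Probability] [#DurationHours]
```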

‎articles/stream-analytics/stream-analytics-set-up-alerts.md

Lines changed: 4 additions & 5 deletions

@@ -7,14 +7,13 @@ ms.author: jeanb
 ms.reviewer: mamccrea
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 02/05/2019
-ms.custom: seodec18
+ms.date: 06/11/2019
 ---
 # Set up alerts for Azure Stream Analytics jobs

 It's important to monitor your Azure Stream Analytics job to ensure the job is running continuously without any problems. This article describes how to set up alerts for common scenarios that should be monitored.

-Rules can be set up on metrics through the portal and can be configured [programmatically](https://code.msdn.microsoft.com/windowsazure/Receive-Email-Notifications-199e2c9a) over Operation Logs data.
+You can define rules on metrics from Operation Logs data through the portal, as well as [programmatically](https://code.msdn.microsoft.com/windowsazure/Receive-Email-Notifications-199e2c9a).

 ## Set up alerts in the Azure portal

@@ -24,15 +23,15 @@ The following example demonstrates how to set up alerts for when your job enters

 2. On the **Job** page, navigate to the **Monitoring** section.

-3. Select **Metrics**, and then click **New alert rule**.
+3. Select **Metrics**, and then **New alert rule**.

 ![Azure portal Stream Analytics alerts setup](./media/stream-analytics-set-up-alerts/stream-analytics-set-up-alerts.png)

 4. Your Stream Analytics job name should automatically appear under **RESOURCE**. Click **Add condition**, and select **All Administrative operations** under **Configure signal logic**.

 ![Select signal name for Stream Analytics alert](./media/stream-analytics-set-up-alerts/stream-analytics-condition-signal.png)

-5. Under **Configure signal logic**, change **Event Level** to **All** and change **Status** to **Failed**. Leave **Event initiated by** blank and click **Done**.
+5. Under **Configure signal logic**, change **Event Level** to **All** and change **Status** to **Failed**. Leave **Event initiated by** blank and select **Done**.

 ![Configure signal logic for Stream Analytics alert](./media/stream-analytics-set-up-alerts/stream-analytics-configure-signal-logic.png)

‎articles/stream-analytics/stream-analytics-use-reference-data.md

Lines changed: 4 additions & 4 deletions

@@ -4,13 +4,13 @@ description: This article describes how to use reference data to lookup or corre
 services: stream-analytics
 author: jseb225
 ms.author: jeanb
-manager: kfile
 ms.reviewer: mamccrea
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 01/29/2019
+ms.date: 06/11/2019
 ---
 # Using reference data for lookups in Stream Analytics
+
 Reference data (also known as a lookup table) is a finite data set that is static or slowly changing in nature, used to perform a lookup or to correlate with your data stream. For example, in an IoT scenario, you could store metadata about sensors (which don’t change often) in reference data and join it with real time IoT data streams. Azure Stream Analytics loads reference data in memory to achieve low latency stream processing. To make use of reference data in your Azure Stream Analytics job, you will generally use a [Reference Data Join](https://msdn.microsoft.com/library/azure/dn949258.aspx) in your query.

 Stream Analytics supports Azure Blob storage and Azure SQL Database as the storage layer for Reference Data. You can also transform and/or copy reference data to Blob storage from Azure Data Factory to use [any number of cloud-based and on-premises data stores](../data-factory/copy-activity-overview.md).

@@ -37,7 +37,7 @@ To configure your reference data, you first need to create an input that is of t

 ### Static reference data

-If your reference data is not expected to change, then support for static reference data is enabled by specifying a static path in the input configuration. Azure Stream Analytics picks up the blob from the specified path. {date} and {time} substitution tokens aren't required. Reference data is immutable in Stream Analytics. Therefore, overwriting a static reference data blob is not recommended.
+If your reference data is not expected to change, then support for static reference data is enabled by specifying a static path in the input configuration. Azure Stream Analytics picks up the blob from the specified path. {date} and {time} substitution tokens aren't required. Because reference data is immutable in Stream Analytics, overwriting a static reference data blob is not recommended.

 ### Generate reference data on a schedule

@@ -48,7 +48,7 @@ Azure Stream Analytics automatically scans for refreshed reference data blobs at
 > [!NOTE]
 > Currently Stream Analytics jobs look for the blob refresh only when the machine time advances to the time encoded in the blob name. For example, the job will look for `sample/2015-04-16/17-30/products.csv` as soon as possible but no earlier than 5:30 PM on April 16th, 2015 UTC time zone. It will *never* look for a blob with an encoded time earlier than the last one that is discovered.
 >
-> E.g. once the job finds the blob `sample/2015-04-16/17-30/products.csv` it will ignore any files with an encoded date earlier than 5:30 PM April 16th, 2015 so if a late arriving `sample/2015-04-16/17-25/products.csv` blob gets created in the same container the job will not use it.
+> For example, once the job finds the blob `sample/2015-04-16/17-30/products.csv` it will ignore any files with an encoded date earlier than 5:30 PM April 16th, 2015 so if a late arriving `sample/2015-04-16/17-25/products.csv` blob gets created in the same container the job will not use it.
 >
 > Likewise if `sample/2015-04-16/17-30/products.csv` is only produced at 10:03 PM April 16th, 2015 but no blob with an earlier date is present in the container, the job will use this file starting at 10:03 PM April 16th, 2015 and use the previous reference data until then.
 >
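As a sketch of the Reference Data Join this article links to: joining a stream input to a reference input needs only an equality predicate, with no temporal condition. All aliases here are hypothetical:

```sql
-- Hypothetical aliases: "IoTStream" is a data stream input,
-- "SensorReference" is a reference data input (for example, a blob).
SELECT
    i.DeviceId,
    r.SensorModel,
    i.Reading
INTO "EnrichedOutput"
FROM "IoTStream" i TIMESTAMP BY EventTime
JOIN "SensorReference" r
    ON i.DeviceId = r.DeviceId
```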

‎articles/stream-analytics/stream-analytics-window-functions.md

Lines changed: 3 additions & 3 deletions

@@ -4,16 +4,16 @@ description: This article describes four windowing functions (tumbling, hopping,
 services: stream-analytics
 author: jseb225
 ms.author: jeanb
-manager: kfile
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 05/07/2018
+ms.date: 06/11/2019
 ---
 # Introduction to Stream Analytics windowing functions
+
 In time-streaming scenarios, performing operations on the data contained in temporal windows is a common pattern. Stream Analytics has native support for windowing functions, enabling developers to author complex stream processing jobs with minimal effort.

-There are four kinds of temporal windows to choose from: [**Tumbling**](https://msdn.microsoft.com/azure/stream-analytics/reference/tumbling-window-azure-stream-analytics), [**Hopping**](https://msdn.microsoft.com/azure/stream-analytics/reference/hopping-window-azure-stream-analytics), [**Sliding**](https://msdn.microsoft.com/azure/stream-analytics/reference/sliding-window-azure-stream-analytics), and [**Session**](https://msdn.microsoft.com/azure/stream-analytics/reference/session-window-azure-stream-analytics) windows. You use the window functions in the [**GROUP BY**](https://msdn.microsoft.com/azure/stream-analytics/reference/group-by-azure-stream-analytics) clause of the query syntax in your Stream Analytics jobs.
+There are four kinds of temporal windows to choose from: [**Tumbling**](https://msdn.microsoft.com/azure/stream-analytics/reference/tumbling-window-azure-stream-analytics), [**Hopping**](https://msdn.microsoft.com/azure/stream-analytics/reference/hopping-window-azure-stream-analytics), [**Sliding**](https://msdn.microsoft.com/azure/stream-analytics/reference/sliding-window-azure-stream-analytics), and [**Session**](https://msdn.microsoft.com/azure/stream-analytics/reference/session-window-azure-stream-analytics) windows. You use the window functions in the [**GROUP BY**](https://msdn.microsoft.com/azure/stream-analytics/reference/group-by-azure-stream-analytics) clause of the query syntax in your Stream Analytics jobs. You can also aggregate events over multiple windows using the [**Windows()** function](https://docs.microsoft.com/stream-analytics-query/windows-azure-stream-analytics).

 All the [windowing](https://msdn.microsoft.com/azure/stream-analytics/reference/windowing-azure-stream-analytics) operations output results at the **end** of the window. The output of the window will be single event based on the aggregate function used. The output event will have the time stamp of the end of the window and all window functions are defined with a fixed length.
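As a short illustration of a windowing function in the **GROUP BY** clause, the tumbling-window sketch below counts events per device over non-overlapping 10-second windows; all names are hypothetical:

```sql
-- One output event per device per window, stamped with the window's end.
SELECT
    DeviceId,
    COUNT(*) AS EventCount,
    System.Timestamp AS WindowEnd
INTO "CountsOutput"
FROM "EventStream" TIMESTAMP BY EventTime
GROUP BY DeviceId, TumblingWindow(second, 10)
```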

‎articles/stream-analytics/stream-analytics-with-azure-functions.md

Lines changed: 16 additions & 22 deletions

@@ -7,14 +7,14 @@ ms.service: stream-analytics
 ms.topic: tutorial
 ms.custom: mvc
 ms.workload: data-services
-ms.date: 04/09/2018
+ms.date: 06/05/2019
 ms.author: mamccrea
 ms.reviewer: jasonh

 #Customer intent: "As an IT admin/developer I want to run Azure Functions with Stream Analytics jobs."
 ---

-# Run Azure Functions from Azure Stream Analytics jobs
+# Tutorial: Run Azure Functions from Azure Stream Analytics jobs

 You can run Azure Functions from Azure Stream Analytics by configuring Functions as one of the output sinks to the Stream Analytics job. Functions are an event-driven, compute-on-demand experience that lets you implement code that is triggered by events occurring in Azure or third-party services. This ability of Functions to respond to triggers makes it a natural output to Stream Analytics jobs.

@@ -23,9 +23,10 @@ Stream Analytics invokes Functions through HTTP triggers. The Functions output a
 In this tutorial, you learn how to:

 > [!div class="checklist"]
-> * Create a Stream Analytics job
-> * Create an Azure function
-> * Configure Azure function as output to your job
+> * Create and run a Stream Analytics job
+> * Create an Azure Cache for Redis instance
+> * Create an Azure Function
+> * Check Azure Cache for Redis for results

 If you don’t have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.

@@ -35,16 +36,9 @@ This section demonstrates how to configure a Stream Analytics job to run a funct

 ![Diagram showing relationships among the Azure services](./media/stream-analytics-with-azure-functions/image1.png)

-The following steps are required to achieve this task:
-* [Create a Stream Analytics job with Event Hubs as input](#create-a-stream-analytics-job-with-event-hubs-as-input)
-* Create an Azure Cache for Redis instance
-* Create a function in Azure Functions that can write data to the Azure Cache for Redis
-* [Update the Stream Analytics job with the function as output](#update-the-stream-analytics-job-with-the-function-as-output)
-* Check Azure Cache for Redis for results
-
 ## Create a Stream Analytics job with Event Hubs as input

-Follow the [Real-time fraud detection](stream-analytics-real-time-fraud-detection.md) tutorial to create an event hub, start the event generator application, and create a Stream Analytics job. (Skip the steps to create the query and the output. Instead, see the following sections to set up Functions output.)
+Follow the [Real-time fraud detection](stream-analytics-real-time-fraud-detection.md) tutorial to create an event hub, start the event generator application, and create a Stream Analytics job. Skip the steps to create the query and the output. Instead, see the following sections to set up an Azure Functions output.

 ## Create an Azure Cache for Redis instance

@@ -58,7 +52,7 @@ Follow the [Real-time fraud detection](stream-analytics-real-time-fraud-detectio

 1. See the [Create a function app](../azure-functions/functions-create-first-azure-function.md#create-a-function-app) section of the Functions documentation. This walks you through how to create a function app and an [HTTP-triggered function in Azure Functions](../azure-functions/functions-create-first-azure-function.md#create-function), by using the CSharp language.

-2. Browse to the **run.csx** function. Update it with the following code. (Make sure to replace "\<your Azure Cache for Redis connection string goes here\>" with the Azure Cache for Redis primary connection string that you retrieved in the previous section.)
+2. Browse to the **run.csx** function. Update it with the following code. Replace **"\<your Azure Cache for Redis connection string goes here\>"** with the Azure Cache for Redis primary connection string that you retrieved in the previous section.

 ```csharp
 using System;
@@ -109,7 +103,7 @@ Follow the [Real-time fraud detection](stream-analytics-real-time-fraud-detectio


 ```

When Stream Analytics receives the "HTTP Request Entity Too Large" exception from the function, it reduces the size of the batches it sends to Functions. In your function, use the following code to check that Stream Analytics doesnt send oversized batches. Make sure that the maximum batch count and size values used in the function are consistent with the values entered in the Stream Analytics portal.
106+
When Stream Analytics receives the "HTTP Request Entity Too Large" exception from the function, it reduces the size of the batches it sends to Functions. The following code ensures that Stream Analytics doesn't send oversized batches. Make sure that the maximum batch count and size values used in the function are consistent with the values entered in the Stream Analytics portal.
113107

114108
```csharp
115109
if (dataArray.ToString().Length > 262144)
@@ -118,7 +112,7 @@ Follow the [Real-time fraud detection](stream-analytics-real-time-fraud-detectio
118112
}
119113
```
120114

121-
3. In a text editor of your choice, create a JSON file named **project.json**. Use the following code, and save it on your local computer. This file contains the NuGet package dependencies required by the C# function.
115+
3. In a text editor of your choice, create a JSON file named **project.json**. Paste the following code, and save it on your local computer. This file contains the NuGet package dependencies required by the C# function.
122116

123117
```json
124118
{
@@ -142,8 +136,6 @@ Follow the [Real-time fraud detection](stream-analytics-real-time-fraud-detectio
142136

143137
![Screenshot of App Service Editor](./media/stream-analytics-with-azure-functions/image4.png)
144138

145-
146-
147139
## Update the Stream Analytics job with the function as output
148140

149141
1. Open your Stream Analytics job on the Azure portal.
@@ -160,9 +152,9 @@ Follow the [Real-time fraud detection](stream-analytics-real-time-fraud-detectio
160152
|Max Batch Count|Specifies the maximum number of events in each batch that is sent to the function. The default value is 100. This property is optional.|
161153
|Key|Allows you to use a function from another subscription. Provide the key value to access your function. This property is optional.|
162154

163-
3. Provide a name for the output alias. In this tutorial, we name it **saop1** (you can use any name of your choice). Fill in other details.
155+
3. Provide a name for the output alias. In this tutorial, it is named **saop1**, but you can use any name of your choice. Fill in other details.
164156

165-
4. Open your Stream Analytics job, and update the query to the following. (Make sure to replace the "saop1" text if you have named the output sink differently.)
157+
4. Open your Stream Analytics job, and update the query to the following. If you did not name your output sink **saop1**, remember to change it in the query.
166158

167159
```sql
168160
SELECT
@@ -175,9 +167,11 @@ Follow the [Real-time fraud detection](stream-analytics-real-time-fraud-detectio
175167
WHERE CS1.SwitchNum != CS2.SwitchNum
176168
```
177169

178-
5. Start the telcodatagen.exe application by running the following command in command line (use the format `telcodatagen.exe [#NumCDRsPerHour] [SIM Card Fraud Probability] [#DurationHours]`):
170+
5. Start the telcodatagen.exe application by running the following command in command line. The command uses the format `telcodatagen.exe [#NumCDRsPerHour] [SIM Card Fraud Probability] [#DurationHours]`.
179171

180-
**telcodatagen.exe 1000 .2 2**
172+
```cmd
173+
telcodatagen.exe 1000 0.2 2
174+
```
181175

182176
6. Start the Stream Analytics job.
183177
