articles/stream-analytics/stream-analytics-add-inputs.md (+2 -2)
@@ -4,12 +4,12 @@ description: This article describe the concept of inputs in an Azure Stream Anal
 services: stream-analytics
 author: jseb225
 ms.author: jeanb
-manager: kfile
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 04/25/2018
+ms.date: 06/11/2019
 ---
+

 # Understand inputs for Azure Stream Analytics

 Azure Stream Analytics jobs connect to one or more data inputs. Each input defines a connection to an existing data source. Stream Analytics accepts data incoming from several kinds of event sources including Event Hubs, IoT Hub, and Blob storage. The inputs are referenced by name in the streaming SQL query that you write for each job. In the query, you can join multiple inputs to blend data or compare streaming data with a lookup to reference data, and pass the results to outputs.
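For illustration, here is a minimal SAQL sketch of what "referenced by name" means in practice; the aliases `input1` and `output1` are hypothetical placeholders for names you would define on the job:

```sql
-- 'input1' and 'output1' are hypothetical aliases defined on the job;
-- the query refers to the input and output purely by those names.
SELECT
    *
INTO
    "output1"
FROM
    "input1"
```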
articles/stream-analytics/stream-analytics-get-started-with-azure-stream-analytics-to-process-data-from-iot-devices.md (+5 -3)
@@ -7,17 +7,19 @@ ms.author: mamccrea
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 12/06/2018
-ms.custom: seodec18
+ms.date: 06/11/2019
 ---
 # Get started with Azure Stream Analytics to process data from IoT devices
-In this tutorial, you will learn how to create stream-processing logic to gather data from Internet of Things (IoT) devices. We will use a real-world, Internet of Things (IoT) use case to demonstrate how to build your solution quickly and economically.
+
+In this tutorial, you learn how to create stream-processing logic to gather data from Internet of Things (IoT) devices. We use a real-world IoT use case to demonstrate how to build your solution quickly and economically.
 * Sample query and data files downloadable from [GitHub](https://aka.ms/azure-stream-analytics-get-started-iot)

 ## Scenario
+
 Contoso, which is a company in the industrial automation space, has completely automated its manufacturing process. The machinery in this plant has sensors that are capable of emitting streams of data in real time. In this scenario, a production floor manager wants to have real-time insights from the sensor data to look for patterns and take actions on them. We will use the Stream Analytics Query Language (SAQL) over the sensor data to find interesting patterns from the incoming stream of data.

 Here data is being generated from a Texas Instruments sensor tag device. The payload of the data is in JSON format and looks like the following:
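The JSON sample itself lies outside the changed lines above. For illustration, a minimal SAQL sketch of the kind of pattern query this scenario calls for; the aliases `iothubinput` and `output1` and the payload fields `dspl` and `temp` are hypothetical:

```sql
-- Hypothetical names throughout: 'iothubinput', 'output1', 'dspl', 'temp'.
-- Average temperature per sensor over 30-second tumbling windows.
SELECT
    dspl AS SensorName,
    AVG(temp) AS AvgTemperature,
    System.Timestamp AS WindowEnd
INTO
    "output1"
FROM
    "iothubinput"
GROUP BY dspl, TumblingWindow(second, 30)
```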
articles/stream-analytics/stream-analytics-power-bi-dashboard.md (+39 -51)
@@ -7,10 +7,10 @@ ms.author: jeanb
 ms.reviewer: mamccrea
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 12/07/2018
-ms.custom: seodec18
+ms.date: 06/11/2019
 ---
-# Tutorial: Stream Analytics and Power BI: A real-time analytics dashboard for streaming data
+# Stream Analytics and Power BI: A real-time analytics dashboard for streaming data
+
 Azure Stream Analytics enables you to take advantage of one of the leading business intelligence tools, [Microsoft Power BI](https://powerbi.com/). In this article, you learn how to create business intelligence tools by using Power BI as an output for your Azure Stream Analytics jobs. You also learn how to create and use a real-time dashboard.

 This article continues from the Stream Analytics [real-time fraud detection](stream-analytics-real-time-fraud-detection.md) tutorial. It builds on the workflow created in that tutorial and adds a Power BI output so that you can visualize fraudulent phone calls that are detected by a Streaming Analytics job.
@@ -32,41 +32,31 @@ In the real-time fraud detection tutorial, the output is sent to Azure Blob stor
 
 1. In the Azure portal, open the Streaming Analytics job that you created earlier. If you used the suggested name, the job is named `sa_frauddetection_job_demo`.

-2. Select the **Outputs** box in the middle of the job dashboard and then select **+ Add**.
-
-3. For **Output Alias**, enter `CallStream-PowerBI`. You can use a different name. If you do, make a note of it, because you need the name later.
-
-4. Under **Sink**, select **Power BI**.
-
-![Create an output for Power BI](./media/stream-analytics-power-bi-dashboard/create-power-bi-output.png)
-
-5. Click **Authorize**.
+2. On the left menu, select **Outputs** under **Job topology**. Then, select **+ Add** and choose **Power BI** from the dropdown menu.

-A window opens where you can provide your Azure credentials for a work or school account.
+3. Select **+ Add** > **Power BI**. Then fill in the form with the following details and select **Authorize**:

-![Power BI authorization window](./media/stream-analytics-power-bi-dashboard/power-bi-authorization.png)
+|**Setting**|**Suggested value**|
+|---------|---------|
+|Output alias | CallStream-PowerBI |
+|Dataset name | sa-dataset |
+|Table name | fraudulent-calls |

-6. Enter your credentials. Be aware that when you enter your credentials, you're also giving permission to the Streaming Analytics job to access your Power BI area.
-7. When you're returned to the **New output** blade, enter the following information:
+> [!WARNING]
+> If Power BI has a dataset and table that have the same names as the ones that you specify in the Stream Analytics job, the existing ones are overwritten.
+> We recommend that you do not explicitly create this dataset and table in your Power BI account. They are automatically created when you start your Stream Analytics job and the job starts pumping output into Power BI. If your job query doesn't return any results, the dataset and table are not created.
+>

-* **Group Workspace**: Select a workspace in your Power BI tenant where you want to create the dataset.
-* **Dataset Name**: Enter `sa-dataset`. You can use a different name. If you do, make a note of it for later.
-* **Table Name**: Enter `fraudulent-calls`. Currently, Power BI output from Stream Analytics jobs can have only one table in a dataset.
-
-![New output details](./media/stream-analytics-power-bi-dashboard/new-output-details.png)
-
-> [!WARNING]
-> If Power BI has a dataset and table that have the same names as the ones that you specify in the Stream Analytics job, the existing ones are overwritten.
-> We recommend that you do not explicitly create this dataset and table in your Power BI account. They are automatically created when you start your Stream Analytics job and the job starts pumping output into Power BI. If your job query doesn't return any results, the dataset and table are not created.
->
+4. When you select **Authorize**, a pop-up window opens and you are asked to provide credentials to authenticate to your Power BI account. Once the authorization is successful, **Save** the settings.

 8. Click **Create**.

 The dataset is created with the following settings:

-* **defaultRetentionPolicy: BasicFIFO**: Data is FIFO, with a maximum of 200,000 rows.
-* **defaultMode: pushStreaming**: The dataset supports both streaming tiles and traditional report-based visuals (a.k.a. push).
+* **defaultRetentionPolicy: BasicFIFO** - Data is FIFO, with a maximum of 200,000 rows.
+* **defaultMode: pushStreaming** - The dataset supports both streaming tiles and traditional report-based visuals (a.k.a. push).

 Currently, you can't create datasets with other flags.
@@ -84,54 +74,52 @@ For more information about Power BI datasets, see the [Power BI REST API](https:
 >[!NOTE]
 >If you did not name the input `CallStream` in the fraud-detection tutorial, substitute your name for `CallStream` in the **FROM** and **JOIN** clauses in the query.

-```SQL
-/* Our criteria for fraud:
-Calls made from the same caller to two phone switches in different locations (for example, Australia and Europe) within five seconds */
+```SQL
+/* Our criteria for fraud:
+Calls made from the same caller to two phone switches in different locations (for example, Australia and Europe) within five seconds */

-SELECT System.Timestamp AS WindowEnd, COUNT(*) AS FraudulentCalls
-INTO "CallStream-PowerBI"
-FROM "CallStream" CS1 TIMESTAMP BY CallRecTime
-JOIN "CallStream" CS2 TIMESTAMP BY CallRecTime
+SELECT System.Timestamp AS WindowEnd, COUNT(*) AS FraudulentCalls
+INTO "CallStream-PowerBI"
+FROM "CallStream" CS1 TIMESTAMP BY CallRecTime
+JOIN "CallStream" CS2 TIMESTAMP BY CallRecTime

-/* Where the caller is the same, as indicated by IMSI (International Mobile Subscriber Identity) */
-ON CS1.CallingIMSI = CS2.CallingIMSI
+/* Where the caller is the same, as indicated by IMSI (International Mobile Subscriber Identity) */
+ON CS1.CallingIMSI = CS2.CallingIMSI

-/* ...and date between CS1 and CS2 is between one and five seconds */
-AND DATEDIFF(ss, CS1, CS2) BETWEEN 1 AND 5
+/* ...and date between CS1 and CS2 is between one and five seconds */
+AND DATEDIFF(ss, CS1, CS2) BETWEEN 1 AND 5

-/* Where the switch location is different */
-WHERE CS1.SwitchNum != CS2.SwitchNum
-GROUP BY TumblingWindow(Duration(second, 1))
-```
+/* Where the switch location is different */
+WHERE CS1.SwitchNum != CS2.SwitchNum
+GROUP BY TumblingWindow(Duration(second, 1))
+```

 4. Click **Save**.


 ## Test the query
+
 This section is optional, but recommended.

 1. If the TelcoStreaming app is not currently running, start it by following these steps:

-* Open a command window.
+* Open Command Prompt.
 * Go to the folder where the telcogenerator.exe and modified telcodatagen.exe.config files are.
 * Run the following command:

 `telcodatagen.exe 1000 .2 2`

-2. In the **Query** blade, click the dots next to the `CallStream` input and then select **Sample data from input**.
+2. On the **Query** page for your Stream Analytics job, click the dots next to the `CallStream` input and then select **Sample data from input**.

 3. Specify that you want three minutes' worth of data and click **OK**. Wait until you're notified that the data has been sampled.

-4. Click **Test** and make sure you're getting results.
-
+4. Click **Test** and review the results.

 ## Run the job

-1. Make sure that the TelcoStreaming app is running.
-
-2. Close the **Query** blade.
+1. Make sure the TelcoStreaming app is running.

-3. In the job blade, click **Start**.
+2. Navigate to the **Overview** page for your Stream Analytics job and select **Start**.

 ![Start the job](./media/stream-analytics-power-bi-dashboard/stream-analytics-sa-job-start-output.png)
articles/stream-analytics/stream-analytics-quick-create-vs.md (+2 -2)
@@ -4,10 +4,10 @@ description: This quickstart shows you how to get started by creating a Stream A
 services: stream-analytics
 author: mamccrea
 ms.author: mamccrea
-ms.date: 12/20/2018
+ms.date: 06/11/2019
 ms.topic: quickstart
 ms.service: stream-analytics
-ms.custom: mvc
+
 #Customer intent: "As an IT admin/developer I want to create a Stream Analytics job, configure input and output & analyze data by using Visual Studio."
articles/stream-analytics/stream-analytics-set-up-alerts.md (+4 -5)
@@ -7,14 +7,13 @@ ms.author: jeanb
 ms.reviewer: mamccrea
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 02/05/2019
-ms.custom: seodec18
+ms.date: 06/11/2019
 ---
 # Set up alerts for Azure Stream Analytics jobs

 It's important to monitor your Azure Stream Analytics job to ensure the job is running continuously without any problems. This article describes how to set up alerts for common scenarios that should be monitored.

-Rules can be set up on metrics through the portal and can be configured [programmatically](https://code.msdn.microsoft.com/windowsazure/Receive-Email-Notifications-199e2c9a) over Operation Logs data.
+You can define rules on metrics from Operation Logs data through the portal, as well as [programmatically](https://code.msdn.microsoft.com/windowsazure/Receive-Email-Notifications-199e2c9a).

 ## Set up alerts in the Azure portal

@@ -24,15 +23,15 @@ The following example demonstrates how to set up alerts for when your job enters
 
 2. On the **Job** page, navigate to the **Monitoring** section.

-3. Select **Metrics**, and then click **New alert rule**.
+3. Select **Metrics**, and then **New alert rule**.
 4. Your Stream Analytics job name should automatically appear under **RESOURCE**. Click **Add condition**, and select **All Administrative operations** under **Configure signal logic**.

 ![Add condition for the alert](./media/stream-analytics-set-up-alerts/stream-analytics-condition-signal.png)

-5. Under **Configure signal logic**, change **Event Level** to **All** and change **Status** to **Failed**. Leave **Event initiated by** blank and click **Done**.
+5. Under **Configure signal logic**, change **Event Level** to **All** and change **Status** to **Failed**. Leave **Event initiated by** blank and select **Done**.

 ![Configure signal logic](./media/stream-analytics-set-up-alerts/stream-analytics-configure-signal-logic.png)
articles/stream-analytics/stream-analytics-use-reference-data.md (+4 -4)
@@ -4,13 +4,13 @@ description: This article describes how to use reference data to lookup or corre
 services: stream-analytics
 author: jseb225
 ms.author: jeanb
-manager: kfile
 ms.reviewer: mamccrea
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 01/29/2019
+ms.date: 06/11/2019
 ---
 # Using reference data for lookups in Stream Analytics
+
 Reference data (also known as a lookup table) is a finite data set that is static or slowly changing in nature, used to perform a lookup or to correlate with your data stream. For example, in an IoT scenario, you could store metadata about sensors (which don’t change often) in reference data and join it with real time IoT data streams. Azure Stream Analytics loads reference data in memory to achieve low latency stream processing. To make use of reference data in your Azure Stream Analytics job, you will generally use a [Reference Data Join](https://msdn.microsoft.com/library/azure/dn949258.aspx) in your query.

 Stream Analytics supports Azure Blob storage and Azure SQL Database as the storage layer for Reference Data. You can also transform and/or copy reference data to Blob storage from Azure Data Factory to use [any number of cloud-based and on-premises data stores](../data-factory/copy-activity-overview.md).
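For illustration, a minimal sketch of the Reference Data Join mentioned above; the streaming input `sensorStream`, the reference input `sensorMetadata`, and their columns are hypothetical:

```sql
-- 'sensorStream' is a hypothetical streaming input; 'sensorMetadata' is a
-- hypothetical reference data input (for example, backed by Blob storage).
SELECT
    s.SensorId,
    s.Temperature,
    m.Location    -- value looked up from the reference data
FROM "sensorStream" s
JOIN "sensorMetadata" m    -- reference data join: no DATEDIFF is required
    ON s.SensorId = m.SensorId
```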
@@ -37,7 +37,7 @@ To configure your reference data, you first need to create an input that is of t
 
 ### Static reference data

-If your reference data is not expected to change, then support for static reference data is enabled by specifying a static path in the input configuration. Azure Stream Analytics picks up the blob from the specified path. {date} and {time} substitution tokens aren't required. Reference data is immutable in Stream Analytics. Therefore, overwriting a static reference data blob is not recommended.
+If your reference data is not expected to change, then support for static reference data is enabled by specifying a static path in the input configuration. Azure Stream Analytics picks up the blob from the specified path. {date} and {time} substitution tokens aren't required. Because reference data is immutable in Stream Analytics, overwriting a static reference data blob is not recommended.

 ### Generate reference data on a schedule

@@ -48,7 +48,7 @@ Azure Stream Analytics automatically scans for refreshed reference data blobs at
 > [!NOTE]
 > Currently Stream Analytics jobs look for the blob refresh only when the machine time advances to the time encoded in the blob name. For example, the job will look for `sample/2015-04-16/17-30/products.csv` as soon as possible but no earlier than 5:30 PM on April 16th, 2015 UTC time zone. It will *never* look for a blob with an encoded time earlier than the last one that is discovered.
 >
-> E.g. once the job finds the blob `sample/2015-04-16/17-30/products.csv` it will ignore any files with an encoded date earlier than 5:30 PM April 16th, 2015 so if a late arriving `sample/2015-04-16/17-25/products.csv` blob gets created in the same container the job will not use it.
+> For example, once the job finds the blob `sample/2015-04-16/17-30/products.csv`, it will ignore any files with an encoded date earlier than 5:30 PM April 16th, 2015; so if a late-arriving `sample/2015-04-16/17-25/products.csv` blob gets created in the same container, the job will not use it.
 >
 > Likewise if `sample/2015-04-16/17-30/products.csv` is only produced at 10:03 PM April 16th, 2015 but no blob with an earlier date is present in the container, the job will use this file starting at 10:03 PM April 16th, 2015 and use the previous reference data until then.
articles/stream-analytics/stream-analytics-window-functions.md (+3 -3)
@@ -4,16 +4,16 @@ description: This article describes four windowing functions (tumbling, hopping,
 services: stream-analytics
 author: jseb225
 ms.author: jeanb
-manager: kfile
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 05/07/2018
+ms.date: 06/11/2019
 ---
 # Introduction to Stream Analytics windowing functions
+
 In time-streaming scenarios, performing operations on the data contained in temporal windows is a common pattern. Stream Analytics has native support for windowing functions, enabling developers to author complex stream processing jobs with minimal effort.

-There are four kinds of temporal windows to choose from: [**Tumbling**](https://msdn.microsoft.com/azure/stream-analytics/reference/tumbling-window-azure-stream-analytics), [**Hopping**](https://msdn.microsoft.com/azure/stream-analytics/reference/hopping-window-azure-stream-analytics), [**Sliding**](https://msdn.microsoft.com/azure/stream-analytics/reference/sliding-window-azure-stream-analytics), and [**Session**](https://msdn.microsoft.com/azure/stream-analytics/reference/session-window-azure-stream-analytics) windows. You use the window functions in the [**GROUP BY**](https://msdn.microsoft.com/azure/stream-analytics/reference/group-by-azure-stream-analytics) clause of the query syntax in your Stream Analytics jobs.
+There are four kinds of temporal windows to choose from: [**Tumbling**](https://msdn.microsoft.com/azure/stream-analytics/reference/tumbling-window-azure-stream-analytics), [**Hopping**](https://msdn.microsoft.com/azure/stream-analytics/reference/hopping-window-azure-stream-analytics), [**Sliding**](https://msdn.microsoft.com/azure/stream-analytics/reference/sliding-window-azure-stream-analytics), and [**Session**](https://msdn.microsoft.com/azure/stream-analytics/reference/session-window-azure-stream-analytics) windows. You use the window functions in the [**GROUP BY**](https://msdn.microsoft.com/azure/stream-analytics/reference/group-by-azure-stream-analytics) clause of the query syntax in your Stream Analytics jobs. You can also aggregate events over multiple windows using the [**Windows()** function](https://docs.microsoft.com/stream-analytics-query/windows-azure-stream-analytics).

 All the [windowing](https://msdn.microsoft.com/azure/stream-analytics/reference/windowing-azure-stream-analytics) operations output results at the **end** of the window. The output of the window will be a single event based on the aggregate function used. The output event will have the time stamp of the end of the window and all window functions are defined with a fixed length.
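As a minimal sketch of a windowing function used in the **GROUP BY** clause, the following query counts events per 10-second tumbling window; the input alias `input1` and the column `DeviceId` are hypothetical:

```sql
-- 'input1' and 'DeviceId' are hypothetical names.
-- Each window emits a single event stamped with the window's end time.
SELECT
    DeviceId,
    COUNT(*) AS EventCount,
    System.Timestamp AS WindowEnd
FROM "input1"
GROUP BY DeviceId, TumblingWindow(second, 10)
```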
articles/stream-analytics/stream-analytics-with-azure-functions.md (+16 -22)
@@ -7,14 +7,14 @@ ms.service: stream-analytics
 ms.topic: tutorial
 ms.custom: mvc
 ms.workload: data-services
-ms.date: 04/09/2018
+ms.date: 06/05/2019
 ms.author: mamccrea
 ms.reviewer: jasonh

 #Customer intent: "As an IT admin/developer I want to run Azure Functions with Stream Analytics jobs."
 ---

-# Run Azure Functions from Azure Stream Analytics jobs
+# Tutorial: Run Azure Functions from Azure Stream Analytics jobs

 You can run Azure Functions from Azure Stream Analytics by configuring Functions as one of the output sinks to the Stream Analytics job. Functions are an event-driven, compute-on-demand experience that lets you implement code that is triggered by events occurring in Azure or third-party services. This ability of Functions to respond to triggers makes it a natural output to Stream Analytics jobs.

@@ -23,9 +23,10 @@ Stream Analytics invokes Functions through HTTP triggers. The Functions output a
 In this tutorial, you learn how to:

 > [!div class="checklist"]
-> * Create a Stream Analytics job
-> * Create an Azure function
-> * Configure Azure function as output to your job
+> * Create and run a Stream Analytics job
+> * Create an Azure Cache for Redis instance
+> * Create an Azure Function
+> * Check Azure Cache for Redis for results

 If you don’t have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.

@@ -35,16 +36,9 @@ This section demonstrates how to configure a Stream Analytics job to run a funct
 
 ![Stream Analytics job with Azure Functions output](./media/stream-analytics-with-azure-functions/image1.png)

-The following steps are required to achieve this task:
-* [Create a Stream Analytics job with Event Hubs as input](#create-a-stream-analytics-job-with-event-hubs-as-input)
-* Create an Azure Cache for Redis instance
-* Create a function in Azure Functions that can write data to the Azure Cache for Redis
-* [Update the Stream Analytics job with the function as output](#update-the-stream-analytics-job-with-the-function-as-output)
-* Check Azure Cache for Redis for results
-
 ## Create a Stream Analytics job with Event Hubs as input

-Follow the [Real-time fraud detection](stream-analytics-real-time-fraud-detection.md) tutorial to create an event hub, start the event generator application, and create a Stream Analytics job. (Skip the steps to create the query and the output. Instead, see the following sections to set up Functions output.)
+Follow the [Real-time fraud detection](stream-analytics-real-time-fraud-detection.md) tutorial to create an event hub, start the event generator application, and create a Stream Analytics job. Skip the steps to create the query and the output. Instead, see the following sections to set up an Azure Functions output.

 ## Create an Azure Cache for Redis instance

@@ -58,7 +52,7 @@ Follow the [Real-time fraud detection](stream-analytics-real-time-fraud-detectio
 
 1. See the [Create a function app](../azure-functions/functions-create-first-azure-function.md#create-a-function-app) section of the Functions documentation. This walks you through how to create a function app and an [HTTP-triggered function in Azure Functions](../azure-functions/functions-create-first-azure-function.md#create-function), by using the CSharp language.

-2. Browse to the **run.csx** function. Update it with the following code. (Make sure to replace "\<your Azure Cache for Redis connection string goes here\>" with the Azure Cache for Redis primary connection string that you retrieved in the previous section.)
+2. Browse to the **run.csx** function. Update it with the following code. Replace **"\<your Azure Cache for Redis connection string goes here\>"** with the Azure Cache for Redis primary connection string that you retrieved in the previous section.

 ```csharp
 using System;
@@ -109,7 +103,7 @@ Follow the [Real-time fraud detection](stream-analytics-real-time-fraud-detectio
 
 ```

-When Stream Analytics receives the "HTTP Request Entity Too Large" exception from the function, it reduces the size of the batches it sends to Functions. In your function, use the following code to check that Stream Analytics doesn’t send oversized batches. Make sure that the maximum batch count and size values used in the function are consistent with the values entered in the Stream Analytics portal.
+When Stream Analytics receives the "HTTP Request Entity Too Large" exception from the function, it reduces the size of the batches it sends to Functions. The following code ensures that Stream Analytics doesn't send oversized batches. Make sure that the maximum batch count and size values used in the function are consistent with the values entered in the Stream Analytics portal.

 ```csharp
 if (dataArray.ToString().Length > 262144)
@@ -118,7 +112,7 @@ Follow the [Real-time fraud detection](stream-analytics-real-time-fraud-detectio
 }
 ```

-3. In a text editor of your choice, create a JSON file named **project.json**. Use the following code, and save it on your local computer. This file contains the NuGet package dependencies required by the C# function.
+3. In a text editor of your choice, create a JSON file named **project.json**. Paste the following code, and save it on your local computer. This file contains the NuGet package dependencies required by the C# function.

 ```json
 {
@@ -142,8 +136,6 @@ Follow the [Real-time fraud detection](stream-analytics-real-time-fraud-detectio
 
 ![Save the function](./media/stream-analytics-with-azure-functions/image2.png)

-
-
 ## Update the Stream Analytics job with the function as output

 1. Open your Stream Analytics job on the Azure portal.
@@ -160,9 +152,9 @@ Follow the [Real-time fraud detection](stream-analytics-real-time-fraud-detectio
 |Max Batch Count|Specifies the maximum number of events in each batch that is sent to the function. The default value is 100. This property is optional.|
 |Key|Allows you to use a function from another subscription. Provide the key value to access your function. This property is optional.|

-3. Provide a name for the output alias. In this tutorial, we name it **saop1** (you can use any name of your choice). Fill in other details.
+3. Provide a name for the output alias. In this tutorial, it is named **saop1**, but you can use any name of your choice. Fill in other details.

-4. Open your Stream Analytics job, and update the query to the following. (Make sure to replace the "saop1" text if you have named the output sink differently.)
+4. Open your Stream Analytics job, and update the query to the following. If you did not name your output sink **saop1**, remember to change it in the query.

 ```sql
 SELECT
@@ -175,9 +167,11 @@ Follow the [Real-time fraud detection](stream-analytics-real-time-fraud-detectio
 WHERE CS1.SwitchNum != CS2.SwitchNum
 ```

-5. Start the telcodatagen.exe application by running the following command in command line (use the format `telcodatagen.exe [#NumCDRsPerHour] [SIM Card Fraud Probability] [#DurationHours]`):
+5. Start the telcodatagen.exe application by running the following command at the command line. The command uses the format `telcodatagen.exe [#NumCDRsPerHour] [SIM Card Fraud Probability] [#DurationHours]`.