
Commit b7b2144

seo updates

Committed Dec 7, 2018 · 1 parent f1c3602 · 22 files changed: +418 / -395 lines
 

articles/stream-analytics/stream-analytics-high-frequency-trading.md

Lines changed: 365 additions & 350 deletions
Large diffs are not rendered by default.

articles/stream-analytics/stream-analytics-how-to-configure-azure-machine-learning-endpoints-in-stream-analytics.md

Lines changed: 11 additions & 12 deletions

@@ -4,11 +4,10 @@ description: This article describes how to use Machine Language user defined fun
 services: stream-analytics
 author: jseb225
 ms.author: jeanb
-manager: kfile
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 03/28/2017
+ms.date: 12/07/2018
 ---
 # Machine Learning integration in Stream Analytics
 Stream Analytics supports user-defined functions that call out to Azure Machine Learning endpoints. REST API support for this feature is detailed in the [Stream Analytics REST API library](https://msdn.microsoft.com/library/azure/dn835031.aspx). This article provides supplemental information needed for successful implementation of this capability in Stream Analytics. A tutorial has also been posted and is available [here](stream-analytics-machine-learning-integration-tutorial.md).

@@ -45,7 +44,7 @@ As an example, the following sample code creates a scalar UDF named *newudf* tha
 
 Example request body:
 
-````
+```json
 {
    "name": "newudf",
    "properties": {

@@ -61,7 +60,7 @@ Example request body:
    }
    }
 }
-````
+```
 
 ## Call RetrieveDefaultDefinition endpoint for default UDF
 Once the skeleton UDF is created the complete definition of the UDF is needed. The RetreiveDefaultDefinition endpoint helps you get the default definition for a scalar function that is bound to an Azure Machine Learning endpoint. The payload below requires you to get the default UDF definition for a scalar function that is bound to an Azure Machine Learning endpoint. It doesn’t specify the actual endpoint as it has already been provided during PUT request. Stream Analytics calls the endpoint provided in the request if it is provided explicitly. Otherwise it uses the one originally referenced. Here the UDF takes a single string parameter (a sentence) and returns a single output of type string which indicates the "sentiment" label for that sentence.

@@ -72,19 +71,19 @@ POST : /subscriptions/<subscriptionId>/resourceGroups/<resourceGroup>/providers/
 
 Example request body:
 
-````
+```json
 {
    "bindingType": "Microsoft.MachineLearning/WebService",
    "bindingRetrievalProperties": {
      "executeEndpoint": null,
      "udfType": "Scalar"
    }
 }
-````
+```
 
 A sample output of this would look something like below.
 
-````
+```json
 {
    "name": "newudf",
    "properties": {

@@ -120,7 +119,7 @@ A sample output of this would look something like below.
    }
    }
 }
-````
+```
 
 ## Patch UDF with the response
 Now the UDF must be patched with the previous response, as shown below.

@@ -131,7 +130,7 @@ PATCH : /subscriptions/<subscriptionId>/resourceGroups/<resourceGroup>/providers
 
 Request Body (Output from RetrieveDefaultDefinition):
 
-````
+```json
 {
    "name": "newudf",
    "properties": {

@@ -167,20 +166,20 @@ Request Body (Output from RetrieveDefaultDefinition):
    }
    }
 }
-````
+```
 
 ## Implement Stream Analytics transformation to call the UDF
 Now query the UDF (here named scoreTweet) for every input event and write a response for that event to an output.
 
-````
+```json
 {
    "name": "transformation",
    "properties": {
      "streamingUnits": null,
      "query": "select *,scoreTweet(Tweet) TweetSentiment into blobOutput from blobInput"
    }
 }
-````
+```
 
 
 ## Get help

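The three REST calls this file walks through (PUT the skeleton UDF, POST to RetrieveDefaultDefinition, PATCH the full definition back) can be chained from any HTTP client. Below is a minimal JavaScript `fetch` sketch of that sequence; the subscription, resource group, job name, bearer token, and api-version are placeholder assumptions, not values taken from this commit.

```javascript
// Sketch of the UDF setup sequence described above. All <...> values and the
// api-version are assumptions to be replaced with real ones.
const base =
  "https://management.azure.com/subscriptions/<subscriptionId>" +
  "/resourceGroups/<resourceGroup>/providers/Microsoft.StreamAnalytics" +
  "/streamingjobs/<jobName>/functions/newudf";
const api = "api-version=2015-10-01";
const headers = {
  Authorization: "Bearer <token>", // acquired separately, e.g. via Azure AD
  "Content-Type": "application/json",
};

async function setUpUdf(skeleton) {
  // 1. PUT the skeleton UDF that binds to the Machine Learning endpoint.
  await fetch(`${base}?${api}`, {
    method: "PUT",
    headers,
    body: JSON.stringify(skeleton),
  });

  // 2. POST to RetrieveDefaultDefinition to obtain the complete definition.
  const res = await fetch(`${base}/RetrieveDefaultDefinition?${api}`, {
    method: "POST",
    headers,
    body: JSON.stringify({
      bindingType: "Microsoft.MachineLearning/WebService",
      bindingRetrievalProperties: { executeEndpoint: null, udfType: "Scalar" },
    }),
  });
  const fullDefinition = await res.json();

  // 3. PATCH the UDF with the definition returned in step 2.
  await fetch(`${base}?${api}`, {
    method: "PATCH",
    headers,
    body: JSON.stringify(fullDefinition),
  });
}
```

Once the PATCH succeeds, the job's query can call the function by its alias, as the transformation example above shows.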
articles/stream-analytics/stream-analytics-introduction.md

Lines changed: 5 additions & 5 deletions

@@ -4,18 +4,18 @@ description: Learn about Stream Analytics, a managed service that helps you anal
 services: stream-analytics
 author: jseb225
 ms.author: jeanb
-manager: kfile
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: overview
 ms.workload: data-services
 ms.custom: mvc
-ms.date: 03/27/2018
+ms.date: 12/07/2018
+ms.custom: seodec18
 #Customer intent: "What is Azure Stream Analytics and why should I care? As a IT Pro or developer, how do I use Stream Analytics to perform analytics on data streams?".
 
 ---
 
-# What is Stream Analytics?
+# What is Azure Stream Analytics?
 
 Azure Stream Analytics is an event-processing engine that allows you to examine high volumes of data streaming from devices. Incoming data can be from devices, sensors, web sites, social media feeds, applications, and more. It also supports extracting information from data streams, identifying patterns, and relationships. You can then use these patterns to trigger other actions downstream, like alerts, feed information to a reporting tool, or store it for later use.

@@ -39,13 +39,13 @@ After analyzing the incoming data, you specify an output for the transformed dat
 
 The following image illustrates the Stream Analytics pipeline, Your Stream Analytics job can use all or a selected set of inputs and outputs. This image shows how data is sent to Stream Analytics, analyzed, and sent for other actions like storage, or presentation:
 
-![Stream Analytics pipeline](./media/stream-analytics-introduction/stream_analytics_intro_pipeline.png)
+![Stream Analytics intro pipeline](./media/stream-analytics-introduction/stream-analytics-intro-pipeline.png)
 
 ## Key capabilities and benefits
 
 Azure Stream Analytics is designed to be easy to use, flexible, reliable, and scalable to any job size. It is available across multiple datacenters as well as sovereign clouds. Following image illustrates the key capabilities of Azure Stream Analytics:
 
-![Stream Analytics key capabilities](./media/stream-analytics-introduction/stream_analytics_key_capabilities.png)
+![Stream Analytics key capabilities](./media/stream-analytics-introduction/stream-analytics-key-capabilities.png)
 
 ## Ease of getting started

articles/stream-analytics/stream-analytics-javascript-user-defined-functions.md

Lines changed: 4 additions & 4 deletions

@@ -57,7 +57,7 @@ To create a simple JavaScript user-defined function under an existing Stream Ana
 4. On the **New Function** blade, for **Function Type**, select **JavaScript**. A default function template appears in the editor.
 5. For the **UDF alias**, enter **hex2Int**, and change the function implementation as follows:
 
-```
+```javascript
 // Convert Hex value to integer.
 function hex2Int(hexValue) {
    return parseInt(hexValue, 16);

@@ -72,7 +72,7 @@ To create a simple JavaScript user-defined function under an existing Stream Ana
 1. In the query editor, under **JOB TOPOLOGY**, select **Query**.
 2. Edit your query, and then call the user-defined function, like this:
 
-```
+```SQL
 SELECT
    time,
    UDF.hex2Int(offset) AS IntOffset

@@ -128,14 +128,14 @@ If you have a follow-up processing step that uses a Stream Analytics job output
 
 **JavaScript user-defined function definition:**
 
-```
+```javascript
 function main(x) {
     return JSON.stringify(x);
 }
 ```
 
 **Sample query:**
-```
+```SQL
 SELECT
     DataString,
     DataValue,

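The two UDF bodies in this file's diff are plain JavaScript, so they can be sanity-checked in any JavaScript engine, with no Stream Analytics runtime involved; for example:

```javascript
// The two UDF bodies from the diff above, runnable as-is in Node or a browser.

// Convert Hex value to integer.
function hex2Int(hexValue) {
  return parseInt(hexValue, 16);
}

// Stringify the input record (used to pass structured output downstream).
function main(x) {
  return JSON.stringify(x);
}

console.log(hex2Int("3ff"));            // 1023
console.log(hex2Int("0x1A"));           // 26 (radix 16 also accepts the 0x prefix)
console.log(main({ DataValue: 1023 })); // {"DataValue":1023}
```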
articles/stream-analytics/stream-analytics-job-diagnostic-logs.md

Lines changed: 6 additions & 6 deletions

@@ -4,11 +4,11 @@ description: This article describes how to analyze diagnostics logs in Azure Str
 services: stream-analytics
 author: jseb225
 ms.author: jeanb
-manager: kfile
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 04/20/2017
+ms.date: 12/07/2018
+ms.custom: seodec18
 ---
 # Troubleshoot Azure Stream Analytics by using diagnostics logs

@@ -30,23 +30,23 @@ Diagnostics logs are **off** by default. To turn on diagnostics logs, complete t
 
 1. Sign in to the Azure portal, and go to the streaming job blade. Under **Monitoring**, select **Diagnostics logs**.
 
-![Blade navigation to diagnostics logs](./media/stream-analytics-job-diagnostic-logs/image1.png)
+![Blade navigation to diagnostics logs](./media/stream-analytics-job-diagnostic-logs/diagnostic-logs-monitoring.png)
 
 2. Select **Turn on diagnostics**.
 
-![Turn on diagnostics logs](./media/stream-analytics-job-diagnostic-logs/image2.png)
+![Turn on Stream Analytics diagnostics logs](./media/stream-analytics-job-diagnostic-logs/turn-on-diagnostic-logs.png)
 
 3. On the **Diagnostics settings** page, for **Status**, select **On**.
 
-![Change status for diagnostics logs](./media/stream-analytics-job-diagnostic-logs/image3.png)
+![Change status for diagnostics logs](./media/stream-analytics-job-diagnostic-logs/save-diagnostic-log-settings.png)
 
 4. Set up the archival target (storage account, event hub, Log Analytics) that you want. Then, select the categories of logs that you want to collect (Execution, Authoring).
 
 5. Save the new diagnostics configuration.
 
 The diagnostics configuration takes about 10 minutes to take effect. After that, the logs start appearing in the configured archival target (you can see these on the **Diagnostics logs** page):
 
-![Blade navigation to diagnostics logs - archival targets](./media/stream-analytics-job-diagnostic-logs/image4.png)
+![Blade navigation to diagnostics logs - archival targets](./media/stream-analytics-job-diagnostic-logs/view-diagnostic-logs-page.png)
 
 For more information about configuring diagnostics, see [Collect and consume diagnostics data from your Azure resources](https://docs.microsoft.com/azure/monitoring-and-diagnostics/monitoring-overview-of-diagnostic-logs).

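The portal steps in this file's diff have a scriptable equivalent in the Azure Monitor diagnostic settings REST API. A minimal JavaScript sketch, assuming placeholder resource IDs, token, setting name, and api-version:

```javascript
// Sketch: enable the Execution and Authoring log categories for a job,
// archived to a storage account. All <...> values are placeholders.
const jobId =
  "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroup>" +
  "/providers/Microsoft.StreamAnalytics/streamingjobs/<jobName>";

async function turnOnDiagnostics() {
  await fetch(
    `https://management.azure.com${jobId}/providers/microsoft.insights` +
      `/diagnosticSettings/<settingName>?api-version=2017-05-01-preview`,
    {
      method: "PUT",
      headers: {
        Authorization: "Bearer <token>",
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        properties: {
          storageAccountId: "<storage account resource ID>", // archival target
          logs: [
            { category: "Execution", enabled: true }, // categories from step 4
            { category: "Authoring", enabled: true },
          ],
        },
      }),
    }
  );
}
```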
articles/stream-analytics/stream-analytics-job-reliability.md

Lines changed: 10 additions & 2 deletions

@@ -3,12 +3,12 @@ title: Avoid service interruptions in Azure Stream Analytics jobs
 description: This article describes guidance on making your Stream Analytics jobs upgrade resilient.
 services: stream-analytics
 author: jseb225
-manager: kfile
 ms.author: jeanb
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 03/28/2017
+ms.date: 12/07/2018
+ms.custom: seodec18
 ---
 
 # Guarantee Stream Analytics job reliability during service updates

@@ -24,3 +24,11 @@ _With the exception of Central India_ (whose paired region, South India, does no
 The article on **[availability and paired regions](https://docs.microsoft.com/azure/best-practices-availability-paired-regions)** has the most up-to-date information on which regions are paired.
 
 Customers are advised to deploy identical jobs to both paired regions. In addition to Stream Analytics internal monitoring capabilities, customers are also advised to monitor the jobs as if **both** are production jobs. If a break is identified to be a result of the Stream Analytics service update, escalate appropriately and fail over any downstream consumers to the healthy job output. Escalation to support will prevent the paired region from being affected by the new deployment and maintain the integrity of the paired jobs.
+
+## Next steps
+
+* [Introduction to Stream Analytics](stream-analytics-introduction.md)
+* [Get started with Stream Analytics](stream-analytics-real-time-fraud-detection.md)
+* [Scale Stream Analytics jobs](stream-analytics-scale-jobs.md)
+* [Stream Analytics query language reference](https://msdn.microsoft.com/library/azure/dn834998.aspx)
+* [Stream Analytics management REST API reference](https://msdn.microsoft.com/library/azure/dn835031.aspx)

articles/stream-analytics/stream-analytics-live-data-local-testing.md

Lines changed: 9 additions & 8 deletions

@@ -1,13 +1,14 @@
 ---
-title: Test live data locally using Azure Stream Analytics tools for Visual Studio (Preview)
+title: Test live data with Azure Stream Analytics for Visual Studio
 description: Learn how to test your Azure Stream Analytics job locally using live streaming data.
 services: stream-analytics
 author: mamccrea
 ms.author: mamccrea
 ms.reviewer: mamccrea
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 09/24/2018
+ms.date: 12/07/2018
+ms.custom: seodec18
 ---
 
 # Test live data locally using Azure Stream Analytics tools for Visual Studio (Preview)

@@ -28,28 +29,28 @@ The following local testing options are supported:
 
 1. After you've created an [Azure Stream Analytics cloud project in Visual Studio](stream-analytics-quick-create-vs.md), open **script.asaql**. The local testing uses local input and local output by default.
 
-![Azure Stream Analytics Visual Studio local testing with local input and local output](./media/stream-analytics-live-data-local-testing/stream-analytics-local-testing-local-input-output.png)
+![Azure Stream Analytics Visual Studio local input and local output](./media/stream-analytics-live-data-local-testing/stream-analytics-local-testing-local-input-output.png)
 
 2. To test live data, choose **Use Cloud Input** from the dropdown box.
 
-![Azure Stream Analytics Visual Studio local testing with live cloud input](./media/stream-analytics-live-data-local-testing/stream-analytics-local-testing-cloud-input.png)
+![Azure Stream Analytics Visual Studio live cloud input](./media/stream-analytics-live-data-local-testing/stream-analytics-local-testing-cloud-input.png)
 
 
 3. Set the **Start Time** to define when the job will start processing input data. The job might need to read input data ahead of time to ensure accurate results. The default time is set to 30 minutes ahead of the current time.
 
-![Azure Stream Analytics Visual Studio local testing with live data start time](./media/stream-analytics-live-data-local-testing/stream-analytics-local-testing-cloud-input-start-time.png)
+![Azure Stream Analytics Visual Studio live data start time](./media/stream-analytics-live-data-local-testing/stream-analytics-local-testing-cloud-input-start-time.png)
 
 4. Click **Run Locally**. A console window will appear with the running progress and job metrics. If you want to stop the process, you can do so manually.
 
-![Azure Stream Analytics Visual Studio local testing with live data process window](./media/stream-analytics-live-data-local-testing/stream-analytics-local-testing-cloud-input-process-window.png)
+![Azure Stream Analytics Visual Studio live data process window](./media/stream-analytics-live-data-local-testing/stream-analytics-local-testing-cloud-input-process-window.png)
 
 The output results are refreshed every three seconds with the first 500 output rows in the local run result window, and the output files are placed in your project path **ASALocalRun** folder. You can also open the output files by clicking **Open Results Folder** button in the local run result window.
 
-![Azure Stream Analytics Visual Studio local testing with live data open results folder](./media/stream-analytics-live-data-local-testing/stream-analytics-local-testing-cloud-input-open-results-folder.png)
+![Azure Stream Analytics Visual Studio live data open results folder](./media/stream-analytics-live-data-local-testing/stream-analytics-local-testing-cloud-input-open-results-folder.png)
 
 5. If you want to output the results to your cloud output sinks, choose **Output to Cloud** from the second dropdown box. Power BI and Azure Data Lake Storage are not supported output sinks.
 
-![Azure Stream Analytics Visual Studio local testing with live data output to cloud](./media/stream-analytics-live-data-local-testing/stream-analytics-local-testing-cloud-output.png)
+![Azure Stream Analytics Visual Studio live data output to cloud](./media/stream-analytics-live-data-local-testing/stream-analytics-local-testing-cloud-output.png)
 
 ## Limitations

articles/stream-analytics/stream-analytics-login-credentials-inputs-outputs.md

Lines changed: 8 additions & 8 deletions

@@ -2,13 +2,13 @@
 title: Rotate login credentials in Azure Stream Analytics jobs
 description: This article describes how to update the credentials of inputs and output sinks in Azure Stream Analytics jobs.
 services: stream-analytics
-author: jasonwhowell
+author: mamccrea
 ms.author: mamccrea
-manager: kfile
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 01/11/2018
+ms.date: 12/07/2018
+ms.custom: seodec18
 ---
 # Rotate login credentials for inputs and outputs of a Stream Analytics Job

@@ -21,20 +21,20 @@ In this section, we will walk you through regenerating credentials for Blob Stor
 ### Blob storage/Table storage
 1. Sign in to the Azure portal > browse the storage account that you used as input/output for the Stream Analytics job.
 2. From the settings section, open **Access keys**. Between the two default keys (key1, key2), pick the one that is not used by your job and regenerate it:
-![Regenerate keys for storage account](media/stream-analytics-login-credentials-inputs-outputs/image1.png)
+![Regenerate keys for storage account](media/stream-analytics-login-credentials-inputs-outputs/regenerate-storage-keys.png)
 3. Copy the newly generated key.
 4. From the Azure portal, browse your Stream Analytics job > select **Stop** and wait for the job to stop.
 5. Locate the Blob/Table storage input/output for which you want to update credentials.
 6. Find the **Storage Account Key** field and paste your newly generated key > click **Save**.
 7. A connection test will automatically start when you save your changes, you can view it from the notifications tab. There are two notifications- one corresponds to saving the update and other corresponds to testing the connection:
-![Notifications after editing the key](media/stream-analytics-login-credentials-inputs-outputs/image4.png)
+![Notifications after editing the key](media/stream-analytics-login-credentials-inputs-outputs/edited-key-notifications.png)
 8. Proceed to [start your job from the last stopped time](#start-your-job-from-the-last-stopped-time) section.
 
 ### Event hubs
 
 1. Sign in to the Azure portal > browse the Event Hub that you used as input/output for the Stream Analytics job.
 2. From the settings section, open **Shared access policies** and select the required access policy. Between the **Primary Key** and **Secondary Key**, pick the one that is not used by your job and regenerate it:
-![Regenerate keys for Event Hub](media/stream-analytics-login-credentials-inputs-outputs/image2.png)
+![Regenerate keys for Event Hubs](media/stream-analytics-login-credentials-inputs-outputs/regenerate-event-hub-keys.png)
 3. Copy the newly generated key.
 4. From the Azure portal, browse your Stream Analytics job > select **Stop** and wait for the job to stop.
 5. Locate the Event hubs input/output for which you want to update credentials.

@@ -48,7 +48,7 @@ You need to connect to the SQL database to update the login credentials of an ex
 
 1. Sign in to the Azure portal > browse the SQL database that you used as output for the Stream Analytics job.
 2. From **Data explorer**, login/connect to your database > select Authorization type as **SQL server authentication** > type in your **Login** and **Password** details > Select **Ok**.
-![Regenerate credentials for SQL database](media/stream-analytics-login-credentials-inputs-outputs/image3.png)
+![Regenerate credentials for SQL database](media/stream-analytics-login-credentials-inputs-outputs/regenerate-sql-credentials.png)
 
 3. In the query tab, alter the password for one of your user's by running the following query (make sure to replace `<user_name>` with your username and `<new_password>` with your new password):

@@ -73,7 +73,7 @@ You need to connect to the SQL database to update the login credentials of an ex
 
 1. Navigate to the job's **Overview** pane > select **Start** to start the job.
 2. Select **When last stopped** > click **Start**. Note that the "When last stopped" option only appears if you previously ran the job and had some output generated. The job is restarted based on the last output value's time.
-![Start the job](media/stream-analytics-login-credentials-inputs-outputs/image5.png)
+![Start the Stream Analytics job](media/stream-analytics-login-credentials-inputs-outputs/start-stream-analytics-job.png)
 
 ## Next steps
 * [Introduction to Azure Stream Analytics](stream-analytics-introduction.md)

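Step 2 of the Blob/Table storage walkthrough in this file (regenerating the key the job is not using) can also be done against the storage resource provider's REST API instead of the portal. A minimal JavaScript sketch, assuming placeholder names, token, api-version, and response shape:

```javascript
// Sketch: regenerate key2 of a storage account and read back the new value.
// All <...> values and the api-version are placeholder assumptions.
const accountId =
  "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroup>" +
  "/providers/Microsoft.Storage/storageAccounts/<accountName>";

async function rotateKey2() {
  const res = await fetch(
    `https://management.azure.com${accountId}/regenerateKey?api-version=2019-06-01`,
    {
      method: "POST",
      headers: {
        Authorization: "Bearer <token>",
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ keyName: "key2" }), // regenerate the key the job is NOT using
    }
  );
  const { keys } = await res.json(); // paste the new key2 value into the job's input/output
  return keys;
}
```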