
Commit 11ae563 ("seo fixes"), committed Dec 6, 2018
1 parent 00923c6
13 files changed: 49 additions, 46 deletions
‎articles/stream-analytics/stream-analytics-clean-up-your-job.md

Lines changed: 6 additions & 6 deletions

@@ -1,14 +1,14 @@
 ---
-title: Clean up your Azure Stream Analytics job
-description: This article is a guide for how to delete Azure Stream Analytics jobs.
+title: Clean up your Azure Stream Analytics job | Microsoft Docs
+description: This article shows you different methods for deleting your Azure Stream Analytics jobs.
 services: stream-analytics
 author: mamccrea
-manager: kfile
 ms.author: mamccrea
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 05/22/2018
+ms.date: 12/06/2018
+ms.custom: seodec18
 ---
 
 # Clean up your Azure Stream Analytics job
@@ -26,7 +26,7 @@ Azure Stream Analytics jobs can be easily deleted through the Azure portal, Azur
 
 3. On the Stream Analytics job page, select **Stop** to stop the job.
 
-![Stop Job](./media/stream-analytics-clean-up-your-job/stop-job.png)
+![Stop Azure Stream Analytics job](./media/stream-analytics-clean-up-your-job/stop-stream-analytics-job.png)
 
 
 ## Delete a job in Azure portal
@@ -37,7 +37,7 @@ Azure Stream Analytics jobs can be easily deleted through the Azure portal, Azur
 
 3. On the Stream Analytics job page, select **Delete** to delete the job.
 
-![Delete Job](./media/stream-analytics-clean-up-your-job/delete-job.png)
+![Delete Azure Stream Analytics Job](./media/stream-analytics-clean-up-your-job/delete-stream-analytics-job.png)
 
 
 ## Stop or delete a job using PowerShell

‎articles/stream-analytics/stream-analytics-common-troubleshooting-issues.md

Lines changed: 6 additions & 6 deletions

@@ -1,14 +1,14 @@
 ---
-title: Common issues to troubleshoot in Azure Stream Analytics
+title: Common issues to troubleshoot in Azure Stream Analytics | Microsoft Docs
 description: This article describes several common issues in Azure Stream Analytics and steps to troubleshoot those issues.
 services: stream-analytics
-author: jasonwhowell
-manager: kfile
+author: mamccrea
 ms.author: mamccrea
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 04/12/2018
+ms.date: 12/06/2018
+ms.custom: seodec18
 ---
 
 # Common issues in Stream Analytics and steps to troubleshoot
@@ -19,7 +19,7 @@ ms.date: 04/12/2018
 
 When a Stream Analytics job receives a malformed message from an input, it drops the message and notifies user with a warning. A warning symbol is shown on the **Inputs** tile of your Stream Analytics job (This warning sign exists as long as the job is in running state):
 
-![Inputs tile](media/stream-analytics-malformed-events/inputs_tile.png)
+![Inputs tile on Azure Stream Analytics dashboard](media/stream-analytics-malformed-events/stream-analytics-inputs-tile.png)
 
 To see more information, enable the diagnostics logs to view the details of the warning. For malformed input events, the execution logs contain an entry with the message that looks like: "Message: Could not deserialize the input event(s) from resource <blob URI> as json".
 
@@ -29,7 +29,7 @@ To see more information, enable the diagnostics logs to view the details of the
 
 2. The input details tile displays a set of warnings with details about the issue. Following is an example warning message, the warning message shows the Partition, Offset, and sequence numbers where there is malformed JSON data.
 
-![Warning message with offset](media/stream-analytics-malformed-events/warning_message_with_offset.png)
+![Input warning message with offset](media/stream-analytics-malformed-events/warning-message-with-offset.png)
 
 3. To get the JSON data that has incorrect format, run the CheckMalformedEvents.cs code. This example is available in the [GitHub samples repository](https://github.com/Azure/azure-stream-analytics/tree/master/Samples/CheckMalformedEventsEH). This code reads the partition ID, offset, and prints the data that's located in that offset.
 
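The troubleshooting steps in the file above pivot on three values surfaced by the input warning: the partition, the offset, and the sequence number of the malformed event. As a rough illustration of pulling those values out of a captured warning (the message layout and field names here are assumptions for the sketch, not the exact portal text), a small parser might look like:

```python
import re

# Hypothetical warning layout: the real portal text varies, but it surfaces a
# partition, an offset, and a sequence number for the malformed event.
WARNING_PATTERN = re.compile(
    r"Partition:\s*\[(?P<partition>\d+)\].*?"
    r"Offset:\s*\[(?P<offset>\d+)\].*?"
    r"SequenceNumber:\s*\[(?P<sequence>\d+)\]",
    re.DOTALL,
)

def parse_input_warning(message: str) -> dict:
    """Extract partition/offset/sequence fields from an input warning message."""
    match = WARNING_PATTERN.search(message)
    if match is None:
        raise ValueError("warning message did not match the expected shape")
    return {k: int(v) for k, v in match.groupdict().items()}

sample = (
    "Could not deserialize the input event(s) from resource ... as json. "
    "Partition: [2], Offset: [145], SequenceNumber: [17]"
)
print(parse_input_warning(sample))  # {'partition': 2, 'offset': 145, 'sequence': 17}
```

The extracted offset and partition are the inputs the CheckMalformedEvents.cs sample mentioned above expects when it reads the raw event back from the source.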

‎articles/stream-analytics/stream-analytics-compatibility-level.md

Lines changed: 5 additions & 5 deletions

@@ -1,13 +1,13 @@
 ---
-title: Understand compatibility level for Azure Stream Analytics jobs
+title: Understand compatibility level for Azure Stream Analytics jobs | Microsoft Docs
 description: Learn how to set a compatibility level for an Azure Stream Analytics job and major changes in the latest compatibility level
 services: stream-analytics
-author: jasonwhowell
+author: mamccrea
 ms.author: mamccrea
-manager: kfile
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 10/15/2018
+ms.date: 12/06/2018
+ms.custom: seodec18
 ---
 
 # Compatibility level for Azure Stream Analytics jobs
@@ -22,7 +22,7 @@ Compatibility level controls the runtime behavior of a stream analytics job. You
 
 Make sure that you stop the job before updating the compatibility level. You can’t update the compatibility level if your job is in a running state.
 
-![Compatibility level in portal](media/stream-analytics-compatibility-level/image1.png)
+![Stream Analytics compatibility level in Azure portal](media/stream-analytics-compatibility-level/stream-analytics-compatibility.png)
 
 
 When you update the compatibility level, the T-SQL compiler validates the job with the syntax that corresponds to the selected compatibility level.

‎articles/stream-analytics/stream-analytics-concepts-checkpoint-replay.md

Lines changed: 5 additions & 5 deletions

@@ -1,14 +1,14 @@
 ---
-title: Checkpoint and replay job recovery concepts in Azure Stream Analytics
+title: Checkpoint and replay job recovery concepts in Azure Stream Analytics | Microsoft Docs
 description: This article describes checkpoint and replay job recovery concepts in Azure Stream Analytics.
 services: stream-analytics
-author: zhongc
-ms.author: zhongc
-manager: kfile
+author: mamccrea
+ms.author: mamccrea
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 04/12/2018
+ms.date: 12/06/2018
+ms.custom: seodec18
 ---
 # Checkpoint and replay concepts in Azure Stream Analytics jobs
 This article describes the internal checkpoint and replay concepts in Azure Stream Analytics, and the impact those have on job recovery. Each time a Stream Analytics job runs, state information is maintained internally. That state information is saved in a checkpoint periodically. In some scenarios, the checkpoint information is used for job recovery if a job failure or upgrade occurs. In other circumstances, the checkpoint cannot be used for recovery, and a replay is necessary.

‎articles/stream-analytics/stream-analytics-custom-path-patterns-blob-storage-output.md

Lines changed: 8 additions & 7 deletions

@@ -1,13 +1,14 @@
 ---
-title: Custom DateTime path patterns for Azure Stream Analytics blob storage output (Preview)
-description:
+title: DateTime path patterns for Azure Stream Analytics blob output (Preview)
+description: This article describes the custom DateTime path patterns feature for blob storage output from Azure Stream Analytics jobs.
 services: stream-analytics
 author: mamccrea
 ms.author: mamccrea
 ms.reviewer: mamccrea
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 09/24/2018
+ms.date: 12/06/2018
+ms.custom: seodec18
 ---
 
 # Custom DateTime path patterns for Azure Stream Analytics blob storage output (Preview)
@@ -56,7 +57,7 @@ For example, `year={datetime:yyyy}/month={datetime:MM}/day={datetime:dd}/hour={d
 
 Custom output eliminates the hassle of altering tables and manually adding partitions to port data between Azure Stream Analytics and Hive. Instead, many folders can be added automatically using:
 
-```
+```SQL
 MSCK REPAIR TABLE while hive.exec.dynamic.partition true
 ```
 
@@ -70,9 +71,9 @@ Create a blob output sink with the following configuration:
 
 The full path pattern is as follows:
 
-```
-year={datetime:yyyy}/month={datetime:MM}/day={datetime:dd}
-```
+
+`year={datetime:yyyy}/month={datetime:MM}/day={datetime:dd}`
+
 
 When you start the job, a folder structure based on the path pattern is created in your blob container. You can drill down to the day level.
‎articles/stream-analytics/stream-analytics-define-inputs.md

Lines changed: 4 additions & 4 deletions

@@ -1,14 +1,14 @@
 ---
-title: Stream data as input into Azure Stream Analytics
+title: Stream data as input into Azure Stream Analytics | Microsoft Docs
 description: Learn about setting up a data connection in Azure Stream Analytics. Inputs include a data stream from events, and also reference data.
 services: stream-analytics
 author: mamccrea
 ms.author: mamccrea
-manager: kfile
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 04/27/2018
+ms.date: 12/06/2018
+ms.custom: seodec18
 ---
 # Stream data as input into Stream Analytics
 
@@ -23,7 +23,7 @@ These input resources can live in the same Azure subscription as your Stream Ana
 Stream Analytics supports compression across all data stream input sources. Currently supported reference types are: None, GZip, and Deflate compression. Support for compression is not available for reference data. If the input format is Avro data that is compressed, it's handled transparently. You don't need to specify compression type with Avro serialization.
 
 ## Create, edit, or test inputs
-You can use the [Azure portal](https://portal.azure.com) to [create new inputs](https://docs.microsoft.com/azure/stream-analytics/stream-analytics-quick-create-portal#configure-input-to-the-job) and view or edit existing inputs on your streaming job. You can also test input connections and [test queries](https://docs.microsoft.com/azure/stream-analytics/stream-analytics-manage-job#test-your-query) from sample data. When you write a query, you will list the input in the FROM clause. You can get the list of available inputs from the **Query** page in the portal. If you wish to use multiple inputs, you can `JOIN` them or write multiple `SELECT` queries.
+You can use the [Azure portal](https://portal.azure.com) to [create new inputs](https://docs.microsoft.com/azure/stream-analytics/stream-analytics-quick-create-portal#configure-job-input) and view or edit existing inputs on your streaming job. You can also test input connections and [test queries](https://docs.microsoft.com/azure/stream-analytics/stream-analytics-manage-job#test-your-query) from sample data. When you write a query, you will list the input in the FROM clause. You can get the list of available inputs from the **Query** page in the portal. If you wish to use multiple inputs, you can `JOIN` them or write multiple `SELECT` queries.
 
 
 ## Stream data from Event Hubs

‎articles/stream-analytics/stream-analytics-define-outputs.md

Lines changed: 10 additions & 9 deletions

@@ -1,13 +1,14 @@
 ---
-title: Understand outputs from Azure Stream Analytics
+title: Understand outputs from Azure Stream Analytics | Microsoft Docs
 description: This article describes data output options available in Azure Stream Analytics, including Power BI for analysis results.
 services: stream-analytics
 author: mamccrea
 ms.author: mamccrea
 ms.reviewer: jasonh
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 11/21/2018
+ms.date: 12/06/2018
+ms.custom: seodec18
 ---
 
 # Understand outputs from Azure Stream Analytics
@@ -29,13 +30,13 @@ Azure Data Lake Store output from Stream Analytics is currently not available in
 
 1. When Data Lake Storage is selected as an output in the Azure portal, you are prompted to authorize a connection to an existing Data Lake Store.
 
-![Authorize Data Lake Store](./media/stream-analytics-define-outputs/06-stream-analytics-define-outputs.png)
+![Authorize connection to Data Lake Store](./media/stream-analytics-define-outputs/06-stream-analytics-define-outputs.png)
 
 2. If you already have access to Data Lake Store, select **Authorize Now** and a page pops up indicating **Redirecting to authorization**. After authorization succeeds, you are presented with the page that allows you to configure the Data Lake Store output.
 
 3. Once you have the Data Lake Store account authenticated, you can configure the properties for your Data Lake Store output. The table below is the list of property names and their description to configure your Data Lake Store output.
 
-![Authorize Data Lake Store](./media/stream-analytics-define-outputs/07-stream-analytics-define-outputs.png)
+![Define Data Lake Store as Stream Analytics output](./media/stream-analytics-define-outputs/07-stream-analytics-define-outputs.png)
 
 | Property name | Description |
 | --- | --- |
@@ -54,7 +55,7 @@ You need to reauthenticate your Data Lake Store account if its password has chan
 
 To renew authorization, **Stop** your job > go to your Data Lake Store output > click the **Renew authorization** link, and for a brief time a page will pop up indicating **Redirecting to authorization...**. The page automatically closes and if successful, indicates **Authorization has been successfully renewed**. You then need to click **Save** at the bottom of the page, and can proceed by restarting your job from the **Last Stopped Time** to avoid data loss.
 
-![Authorize Data Lake Store](./media/stream-analytics-define-outputs/08-stream-analytics-define-outputs.png)
+![Renew Data Lake Store authorization in output](./media/stream-analytics-define-outputs/08-stream-analytics-define-outputs.png)
 
 ## SQL Database
 [Azure SQL Database](https://azure.microsoft.com/services/sql-database/) can be used as an output for data that is relational in nature or for applications that depend on content being hosted in a relational database. Stream Analytics jobs write to an existing table in an Azure SQL Database. The table schema must exactly match the fields and their types being output from your job. An [Azure SQL Data Warehouse](https://azure.microsoft.com/documentation/services/sql-data-warehouse/) can also be specified as an output via the SQL Database output option as well. To learn about ways to improve write throughput, refer to the [Stream Analytics with Azure SQL DB as output](stream-analytics-sql-output-perf.md) article. The table below lists the property names and their description for creating a SQL Database output.
@@ -128,11 +129,11 @@ Power BI output from Stream Analytics is currently not available in the Azure Ch
 ### Authorize a Power BI account
 1. When Power BI is selected as an output in the Azure portal, you are prompted to authorize an existing Power BI User or to create a new Power BI account.
 
-![Authorize Power BI User](./media/stream-analytics-define-outputs/01-stream-analytics-define-outputs.png)
+![Authorize Power BI user to configure output](./media/stream-analytics-define-outputs/01-stream-analytics-define-outputs.png)
 
 2. Create a new account if you don’t yet have one, then click Authorize Now. The following page is shown:
 
-![Azure Account Power BI](./media/stream-analytics-define-outputs/02-stream-analytics-define-outputs.png)
+![Authenticate to Power BI from Azure Account](./media/stream-analytics-define-outputs/02-stream-analytics-define-outputs.png)
 
 3. In this step, provide the work or school account for authorizing the Power BI output. If you are not already signed up for Power BI, choose Sign up now. The work or school account you use for Power BI could be different from the Azure subscription account, which you are currently logged in with.
 
@@ -185,11 +186,11 @@ Datetime | String | String | Datetime | String
 ### Renew Power BI Authorization
 If your Power BI account password changes after your Stream Analytics job was created or last authenticated, you need to reauthenticate the Stream Analytics. If Multi-Factor Authentication (MFA) is configured on your Azure Active Directory (AAD) tenant, you also need to renew Power BI authorization every two weeks. A symptom of this issue is no job output and an "Authenticate user error" in the Operation Logs:
 
-![Power BI refresh token error](./media/stream-analytics-define-outputs/03-stream-analytics-define-outputs.png)
+![Power BI authenticate user error](./media/stream-analytics-define-outputs/03-stream-analytics-define-outputs.png)
 
 To resolve this issue, stop your running job and go to your Power BI output. Select the **Renew authorization** link, and restart your job from the **Last Stopped Time** to avoid data loss.
 
-![Power BI renews authorization](./media/stream-analytics-define-outputs/04-stream-analytics-define-outputs.png)
+![Renew Power BI authorization for output](./media/stream-analytics-define-outputs/04-stream-analytics-define-outputs.png)
 
 ## Table Storage
 [Azure Table storage](../storage/common/storage-introduction.md) offers highly available, massively scalable storage, so that an application can automatically scale to meet user demand. Table storage is Microsoft’s NoSQL key/attribute store, which one can leverage for structured data with fewer constraints on the schema. Azure Table storage can be used to store data for persistence and efficient retrieval.

‎articles/stream-analytics/stream-analytics-documentdb-output.md

Lines changed: 5 additions & 4 deletions

@@ -1,13 +1,14 @@
 ---
-title: Azure Stream Analytics output to Cosmos DB
+title: Azure Stream Analytics output to Cosmos DB | Microsoft Docs
 description: This article describes how to use Azure Stream Analytics to save output to Azure Cosmos DB for JSON output, for data archiving and low-latency queries on unstructured JSON data.
 services: stream-analytics
-author: jseb225
-ms.author: jeanb
+author: mamccrea
+ms.author: mamccrea
 ms.reviewer: mamccrea
 ms.service: stream-analytics
 ms.topic: conceptual
-ms.date: 11/21/2017
+ms.date: 12/06/2018
+ms.custom: seodec18
 ---
 # Azure Stream Analytics output to Azure Cosmos DB
 Stream Analytics can target [Azure Cosmos DB](https://azure.microsoft.com/services/documentdb/) for JSON output, enabling data archiving and low-latency queries on unstructured JSON data. This document covers some best practices for implementing this configuration.
