articles/stream-analytics/stream-analytics-common-troubleshooting-issues.md (+6 −6)

@@ -1,14 +1,14 @@
---
-title: Common issues to troubleshoot in Azure Stream Analytics
+title: Common issues to troubleshoot in Azure Stream Analytics | Microsoft Docs
description: This article describes several common issues in Azure Stream Analytics and steps to troubleshoot those issues.
services: stream-analytics
-author: jasonwhowell
-manager: kfile
+author: mamccrea
ms.author: mamccrea
ms.reviewer: jasonh
ms.service: stream-analytics
ms.topic: conceptual
-ms.date: 04/12/2018
+ms.date: 12/06/2018
+ms.custom: seodec18
---

# Common issues in Stream Analytics and steps to troubleshoot
@@ -19,7 +19,7 @@ ms.date: 04/12/2018
When a Stream Analytics job receives a malformed message from an input, it drops the message and notifies the user with a warning. A warning symbol is shown on the **Inputs** tile of your Stream Analytics job, and it remains as long as the job is in a running state.

To see more information, enable the diagnostics logs to view the details of the warning. For malformed input events, the execution logs contain an entry with a message that looks like: "Message: Could not deserialize the input event(s) from resource <blobURI> as json".
@@ -29,7 +29,7 @@ To see more information, enable the diagnostics logs to view the details of the
2. The input details tile displays a set of warnings with details about the issue. The following example warning message shows the partition, offset, and sequence numbers where there is malformed JSON data.

3. To get the JSON data that has an incorrect format, run the CheckMalformedEvents.cs code. This example is available in the [GitHub samples repository](https://github.com/Azure/azure-stream-analytics/tree/master/Samples/CheckMalformedEventsEH). This code reads the partition ID and offset, and prints the data located at that offset.
articles/stream-analytics/stream-analytics-concepts-checkpoint-replay.md (+5 −5)

@@ -1,14 +1,14 @@
---
-title: Checkpoint and replay job recovery concepts in Azure Stream Analytics
+title: Checkpoint and replay job recovery concepts in Azure Stream Analytics | Microsoft Docs
description: This article describes checkpoint and replay job recovery concepts in Azure Stream Analytics.
services: stream-analytics
-author: zhongc
-ms.author: zhongc
-manager: kfile
+author: mamccrea
+ms.author: mamccrea
ms.reviewer: jasonh
ms.service: stream-analytics
ms.topic: conceptual
-ms.date: 04/12/2018
+ms.date: 12/06/2018
+ms.custom: seodec18
---
# Checkpoint and replay concepts in Azure Stream Analytics jobs
This article describes the internal checkpoint and replay concepts in Azure Stream Analytics, and the impact those have on job recovery. Each time a Stream Analytics job runs, state information is maintained internally. That state information is saved in a checkpoint periodically. In some scenarios, the checkpoint information is used for job recovery if a job failure or upgrade occurs. In other circumstances, the checkpoint cannot be used for recovery, and a replay is necessary.
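To make the replay concept concrete, consider a windowed aggregate. The query below is a minimal sketch, not taken from the article: the input, output, and column names are hypothetical. Because the COUNT accumulates state across the whole one-hour window, recovering this job without a usable checkpoint can require replaying up to an hour of input.

```SQL
-- Minimal hypothetical job: the aggregate's state spans a full one-hour
-- tumbling window, so recovery without a usable checkpoint may replay up
-- to one hour of input to rebuild that state.
SELECT
    deviceId,
    COUNT(*) AS eventCount
INTO
    [aggregated-output]
FROM
    [device-input] TIMESTAMP BY eventTime
GROUP BY
    deviceId,
    TumblingWindow(hour, 1)
```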
@@ -56,7 +57,7 @@ For example, `year={datetime:yyyy}/month={datetime:MM}/day={datetime:dd}/hour={d
Custom output eliminates the hassle of altering tables and manually adding partitions to port data between Azure Stream Analytics and Hive. Instead, many folders can be added automatically using:

-```
+```SQL
SET hive.exec.dynamic.partition = true;
MSCK REPAIR TABLE <your_table_name>;
```
@@ -70,9 +71,9 @@ Create a blob output sink with the following configuration:
articles/stream-analytics/stream-analytics-define-inputs.md (+4 −4)

@@ -1,14 +1,14 @@
---
-title: Stream data as input into Azure Stream Analytics
+title: Stream data as input into Azure Stream Analytics | Microsoft Docs
description: Learn about setting up a data connection in Azure Stream Analytics. Inputs include a data stream from events, and also reference data.
services: stream-analytics
author: mamccrea
ms.author: mamccrea
-manager: kfile
ms.reviewer: jasonh
ms.service: stream-analytics
ms.topic: conceptual
-ms.date: 04/27/2018
+ms.date: 12/06/2018
+ms.custom: seodec18
---
# Stream data as input into Stream Analytics
@@ -23,7 +23,7 @@ These input resources can live in the same Azure subscription as your Stream Ana
Stream Analytics supports compression across all data stream input sources. Currently supported compression types are None, GZip, and Deflate. Support for compression is not available for reference data. If the input format is compressed Avro data, it's handled transparently. You don't need to specify the compression type with Avro serialization.

## Create, edit, or test inputs
-You can use the [Azure portal](https://portal.azure.com) to [create new inputs](https://docs.microsoft.com/azure/stream-analytics/stream-analytics-quick-create-portal#configure-input-to-the-job) and view or edit existing inputs on your streaming job. You can also test input connections and [test queries](https://docs.microsoft.com/azure/stream-analytics/stream-analytics-manage-job#test-your-query) from sample data. When you write a query, you will list the input in the FROM clause. You can get the list of available inputs from the **Query** page in the portal. If you wish to use multiple inputs, you can `JOIN` them or write multiple `SELECT` queries.
+You can use the [Azure portal](https://portal.azure.com) to [create new inputs](https://docs.microsoft.com/azure/stream-analytics/stream-analytics-quick-create-portal#configure-job-input) and view or edit existing inputs on your streaming job. You can also test input connections and [test queries](https://docs.microsoft.com/azure/stream-analytics/stream-analytics-manage-job#test-your-query) from sample data. When you write a query, list the input in the FROM clause. You can get the list of available inputs from the **Query** page in the portal. To use multiple inputs, `JOIN` them or write multiple `SELECT` queries, as in the sketch below.
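As an illustration of the multiple-input case, here is a minimal sketch of a two-input `JOIN`; the input aliases `telemetry` and `alerts`, the output alias, and the columns are assumptions for this example, not names from the article. In Stream Analytics, a streaming `JOIN` bounds how far apart matching events can be in time with `DATEDIFF`:

```SQL
-- Join two hypothetical streaming inputs on device ID; the DATEDIFF
-- condition limits matches to alerts within five minutes of a reading.
SELECT
    t.deviceId,
    t.temperature,
    a.alertType
INTO
    [joined-output]
FROM
    telemetry t TIMESTAMP BY eventTime
JOIN
    alerts a TIMESTAMP BY eventTime
    ON t.deviceId = a.deviceId
    AND DATEDIFF(minute, t, a) BETWEEN 0 AND 5
```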
articles/stream-analytics/stream-analytics-define-outputs.md (+10 −9)

@@ -1,13 +1,14 @@
---
-title: Understand outputs from Azure Stream Analytics
+title: Understand outputs from Azure Stream Analytics | Microsoft Docs
description: This article describes data output options available in Azure Stream Analytics, including Power BI for analysis results.
services: stream-analytics
author: mamccrea
ms.author: mamccrea
ms.reviewer: jasonh
ms.service: stream-analytics
ms.topic: conceptual
-ms.date: 11/21/2018
+ms.date: 12/06/2018
+ms.custom: seodec18
---

# Understand outputs from Azure Stream Analytics
@@ -29,13 +30,13 @@ Azure Data Lake Store output from Stream Analytics is currently not available in
1. When Data Lake Storage is selected as an output in the Azure portal, you're prompted to authorize a connection to an existing Data Lake Store.

2. If you already have access to Data Lake Store, select **Authorize Now**. A page pops up indicating **Redirecting to authorization**. After authorization succeeds, you're presented with the page that allows you to configure the Data Lake Store output.

3. Once the Data Lake Store account is authenticated, you can configure the properties for your Data Lake Store output. The table below lists the property names and their descriptions.

| Property name | Description |
| --- | --- |
@@ -54,7 +55,7 @@ You need to reauthenticate your Data Lake Store account if its password has chan
To renew authorization, **Stop** your job > go to your Data Lake Store output > select the **Renew authorization** link. A page pops up briefly, indicating **Redirecting to authorization...**, then closes automatically. If renewal succeeds, the page indicates **Authorization has been successfully renewed**. Click **Save** at the bottom of the page, then restart your job from the **Last Stopped Time** to avoid data loss.
## SQL Database
[Azure SQL Database](https://azure.microsoft.com/services/sql-database/) can be used as an output for data that's relational in nature, or for applications that depend on content being hosted in a relational database. Stream Analytics jobs write to an existing table in an Azure SQL Database; the table schema must exactly match the fields and their types in your job's output, as sketched below. An [Azure SQL Data Warehouse](https://azure.microsoft.com/documentation/services/sql-data-warehouse/) can also be specified as an output via the SQL Database output option. To learn about ways to improve write throughput, see the [Stream Analytics with Azure SQL DB as output](stream-analytics-sql-output-perf.md) article. The table below lists the property names and their descriptions for creating a SQL Database output.
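As a sketch of that schema-match requirement (the table name, output and input aliases, and columns below are hypothetical, not from this article), the destination table mirrors exactly the fields and types the query selects:

```SQL
-- Hypothetical destination table: column names and types must exactly
-- match the fields the Stream Analytics query outputs.
CREATE TABLE dbo.DeviceTelemetry (
    deviceId    NVARCHAR(64) NOT NULL,
    eventTime   DATETIME2    NOT NULL,
    temperature FLOAT        NOT NULL
);

-- Matching job query; [sql-db-output] is the output alias configured for
-- the table above, and [telemetry-input] is the job's input alias.
SELECT
    deviceId,
    eventTime,
    temperature
INTO
    [sql-db-output]
FROM
    [telemetry-input] TIMESTAMP BY eventTime
```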
@@ -128,11 +129,11 @@ Power BI output from Stream Analytics is currently not available in the Azure Ch
### Authorize a Power BI account
1. When Power BI is selected as an output in the Azure portal, you're prompted to authorize an existing Power BI user or to create a new Power BI account.

2. Create a new account if you don't yet have one, then select **Authorize Now**.

3. In this step, provide the work or school account for authorizing the Power BI output. If you aren't already signed up for Power BI, choose **Sign up now**. The work or school account you use for Power BI can be different from the Azure subscription account that you're currently signed in with.
If your Power BI account password changes after your Stream Analytics job was created or last authenticated, you need to reauthenticate the Stream Analytics job. If Multi-Factor Authentication (MFA) is configured on your Azure Active Directory (AAD) tenant, you also need to renew the Power BI authorization every two weeks. A symptom of this issue is no job output and an "Authenticate user error" in the operation logs.

To resolve this issue, stop your running job and go to your Power BI output. Select the **Renew authorization** link, and restart your job from the **Last Stopped Time** to avoid data loss.
## Table Storage
[Azure Table storage](../storage/common/storage-introduction.md) offers highly available, massively scalable storage, so that an application can automatically scale to meet user demand. Table storage is Microsoft's NoSQL key/attribute store, which you can use for structured data with fewer constraints on the schema. Azure Table storage stores data for persistence and efficient retrieval.
articles/stream-analytics/stream-analytics-documentdb-output.md (+5 −4)

@@ -1,13 +1,14 @@
---
-title: Azure Stream Analytics output to Cosmos DB
+title: Azure Stream Analytics output to Cosmos DB | Microsoft Docs
description: This article describes how to use Azure Stream Analytics to save output to Azure Cosmos DB for JSON output, for data archiving and low-latency queries on unstructured JSON data.
services: stream-analytics
-author: jseb225
-ms.author: jeanb
+author: mamccrea
+ms.author: mamccrea
ms.reviewer: mamccrea
ms.service: stream-analytics
ms.topic: conceptual
-ms.date: 11/21/2017
+ms.date: 12/06/2018
+ms.custom: seodec18
---
# Azure Stream Analytics output to Azure Cosmos DB
Stream Analytics can target [Azure Cosmos DB](https://azure.microsoft.com/services/documentdb/) for JSON output, enabling data archiving and low-latency queries on unstructured JSON data. This document covers some best practices for implementing this configuration.
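Because Cosmos DB documents are schema-free, the job's output doesn't have to match a predefined table schema. As a minimal sketch (the output and input aliases here are assumptions, not names from the article), a pass-through query is enough to archive every event as a JSON document:

```SQL
-- Pass events through unchanged; each output event is written to the
-- target Cosmos DB collection as a JSON document.
SELECT
    *
INTO
    [cosmos-db-output]
FROM
    [eventhub-input]
```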