articles/stream-analytics/capture-event-hub-data-parquet.md (+2 -2)
@@ -8,7 +8,7 @@ ms.topic: how-to
ms.custom: mvc
ms.date: 05/08/2022
---
-# Capture data from Event Hub in Parquet format
+# Capture data from Event Hubs in Parquet format

This article explains how to use the no code editor to automatically capture streaming data in Event Hubs in an Azure Data Lake Storage Gen2 account in Parquet format. You have the flexibility of specifying a time or size interval.

@@ -44,7 +44,7 @@ Use the following steps to configure a Stream Analytics job to capture data in A
1. Choose the output start time.
1. Select the number of Streaming Units (SU) that the job runs with. SU represents the computing resources that are allocated to execute a Stream Analytics job. For more information, see [Streaming Units in Azure Stream Analytics](stream-analytics-streaming-unit-consumption.md).
1. In the **Choose Output data error handling** list, select the behavior you want when the output of the job fails due to a data error. Select **Retry** to have the job retry until it writes successfully, or select another option.
:::image type="content" source="./media/capture-event-hub-data-parquet/start-job.png" alt-text="Screenshot showing the Start Stream Analytics job window where you set the output start time, streaming units, and error handling." lightbox="./media/capture-event-hub-data-parquet/start-job.png" :::

The new job is shown on the **Stream Analytics jobs** tab. Select **Open metrics** to monitor it.
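For readers who want to see what the no code editor produces, the capture flow above is conceptually a pass-through Stream Analytics query from the Event Hubs input to the Data Lake Storage Gen2 output; Parquet serialization and the time or size interval are configured on the output rather than in the query. A minimal sketch, with placeholder input and output aliases (not names the editor generates):

```sql
-- A pass-through capture query: every event from the Event Hubs input is
-- written unchanged to the Data Lake Storage Gen2 output.
-- [eventhub-input] and [datalake-output] are placeholder aliases; Parquet
-- format and the batching interval are properties of the output configuration.
SELECT
    *
INTO
    [datalake-output]
FROM
    [eventhub-input]
```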
articles/stream-analytics/filter-ingest-data-lake-storage-gen2.md (+3 -3)
@@ -26,7 +26,7 @@ This article describes how you can use the no code editor to easily create a Str
1. Enter a name for the Stream Analytics job, then select **Create**.
:::image type="content" source="./media/filter-ingest-data-lake-storage-gen2/create-job.png" alt-text="Screenshot showing where to enter a job name." lightbox="./media/filter-ingest-data-lake-storage-gen2/create-job.png" :::
1. Specify the **Serialization** type of your data in the Event Hubs window and the **Authentication method** that the job will use to connect to the Event Hubs. Then select **Connect**.
-:::image type="content" source="./media/filter-ingest-data-lake-storage-gen2/event-hub-review-connect.png" alt-text="Screenshot showing the Event Hub area where you select Serialization and Authentication method." lightbox="./media/filter-ingest-data-lake-storage-gen2/event-hub-review-connect.png" :::
+:::image type="content" source="./media/filter-ingest-data-lake-storage-gen2/event-hub-review-connect.png" alt-text="Screenshot showing the Event Hubs area where you select Serialization and Authentication method." lightbox="./media/filter-ingest-data-lake-storage-gen2/event-hub-review-connect.png" :::
1. If the connection is established successfully and you have data streams flowing into the Event Hubs instance, you'll immediately see two things:
    1. Fields that are present in the input data. You can choose **Add field** or select the three dot symbol next to each field to remove, rename, or change its type.
    :::image type="content" source="./media/filter-ingest-data-lake-storage-gen2/add-field.png" alt-text="Screenshot showing where you can add a field or remove, rename, or change a field type." lightbox="./media/filter-ingest-data-lake-storage-gen2/add-field.png" :::
@@ -45,11 +45,11 @@ This article describes how you can use the no code editor to easily create a Str
:::image type="content" source="./media/filter-ingest-data-lake-storage-gen2/no-code-save-start.png" alt-text="Screenshot showing the job Save and Start options." lightbox="./media/filter-ingest-data-lake-storage-gen2/no-code-save-start.png" :::
1. To start the job, specify the number of **Streaming Units (SUs)** that the job runs with. SUs represent the amount of compute and memory allocated to the job. We recommend that you start with three and then adjust as needed.
1. After you select **Start**, the job starts running within two minutes.
-:::image type="content" source="./media/filter-ingest-data-lake-storage-gen2/no-code-start-job.png" alt-text="Screenshot showing the Start Stream Analytics job window where you select Start." lightbox="./media/filter-ingest-data-lake-storage-gen2/no-code-start-job.png" :::

You can see the job under the Process Data section in the **Stream Analytics jobs** tab. Select **Open metrics** to monitor it or stop and restart it, as needed.

-:::image type="content" source="./media/filter-ingest-data-lake-storage-gen2/no-code-list-jobs.png" alt-text="Screenshot showing the Stream Analytics jobs tab where you can view the the current job status." lightbox="./media/filter-ingest-data-lake-storage-gen2/no-code-list-jobs.png" :::
+:::image type="content" source="./media/filter-ingest-data-lake-storage-gen2/no-code-list-jobs.png" alt-text="Screenshot showing the Stream Analytics jobs tab." lightbox="./media/filter-ingest-data-lake-storage-gen2/no-code-list-jobs.png" :::
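As a rough sketch of the kind of query this filtering job corresponds to, a Stream Analytics query that keeps only matching events and writes them to the Data Lake Storage Gen2 output might look like the following; the field name, threshold, and aliases are hypothetical placeholders:

```sql
-- Keep only events that satisfy the filter and write them to the
-- Data Lake Storage Gen2 output; non-matching events are dropped.
-- "temperature" and the threshold 27 are hypothetical example values.
SELECT
    *
INTO
    [datalake-output]
FROM
    [eventhub-input]
WHERE
    temperature > 27
```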
articles/stream-analytics/no-code-materialize-cosmos-db.md (+5 -5)
@@ -23,19 +23,19 @@ Use the following steps to develop a Stream Analytics job to materialize data in
1. In the Azure portal, locate and select your Azure Event Hubs instance.
2. Under **Features**, select **Process Data**. Then, select **Start** in the card titled **Materialize Data in Cosmos DB**.
-:::image type="content" source="./media/no-code-materialize-cosmosdb/no-code-materialize-view-start.png" alt-text="Screenshot showing the Start Materialize Data Flow." lightbox="./media/no-code-materialize-cosmosdb/no-code-materialize-view-start.png" :::
+:::image type="content" source="./media/no-code-materialize-cosmos-db/no-code-materialize-view-start.png" alt-text="Screenshot showing the Start Materialize Data Flow." lightbox="./media/no-code-materialize-cosmos-db/no-code-materialize-view-start.png" :::
3. Enter a name for your job and select **Create**.
4. Specify the **Serialization** type of your data in the event hub and the **Authentication method** that the job will use to connect to the Event Hubs. Then select **Connect**.
5. If the connection is successful and you have data streams flowing into your Event Hubs instance, you'll immediately see two things:
    - Fields that are present in your input payload. Select the three dot symbol next to a field to optionally remove, rename, or change the data type of the field.
-   :::image type="content" source="./media/no-code-materialize-cosmosdb/no-code-schema.png" alt-text="Screenshot showing the event hub fields of input for you to review." lightbox="./media/no-code-materialize-cosmosdb/no-code-schema.png" :::
+   :::image type="content" source="./media/no-code-materialize-cosmos-db/no-code-schema.png" alt-text="Screenshot showing the event hub fields of input for you to review." lightbox="./media/no-code-materialize-cosmos-db/no-code-schema.png" :::
    - A sample of your input data in the bottom pane under **Data preview** that automatically refreshes periodically. You can select **Pause streaming preview** if you prefer to have a static view of your sample input data.
6. In the next step, you specify the field and the **aggregate** you want to calculate, such as Average and Count. You can also specify the field that you want to **Group By** along with the **time window**. Then you can validate the results of the step in the **Data preview** section.
-:::image type="content" source="./media/no-code-materialize-cosmosdb/no-code-group-by.png" alt-text="Screenshot showing the Group By area." lightbox="./media/no-code-materialize-cosmosdb/no-code-group-by.png" :::
+:::image type="content" source="./media/no-code-materialize-cosmos-db/no-code-group-by.png" alt-text="Screenshot showing the Group By area." lightbox="./media/no-code-materialize-cosmos-db/no-code-group-by.png" :::
7. Choose the **Cosmos DB database** and **container** where you want results written.
8. Start the Stream Analytics job by selecting **Start**.
-:::image type="content" source="./media/no-code-materialize-cosmosdb/no-code-cosmosdb-start.png" alt-text="Screenshot showing your definition where you select Start." lightbox="./media/no-code-materialize-cosmosdb/no-code-cosmosdb-start.png" :::
+:::image type="content" source="./media/no-code-materialize-cosmos-db/no-code-cosmos-db-start.png" alt-text="Screenshot showing your definition where you select Start." lightbox="./media/no-code-materialize-cosmos-db/no-code-cosmos-db-start.png" :::

To start the job, you must specify:
- The number of **Streaming Units (SU)** the job runs with. SUs represent the amount of compute and memory allocated to the job. We recommend that you start with three and adjust as needed.
- **Output data error handling** allows you to specify the behavior you want when a job's output to your destination fails due to data errors. By default, your job retries until the write operation succeeds. You can also choose to drop output events.
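For orientation, the **Group By** step with an aggregate and a time window corresponds to a windowed GROUP BY in the Stream Analytics query language. A minimal sketch, assuming a hypothetical deviceId field, an average over a temperature field, and a 1-minute tumbling window (aliases and field names are illustrative only):

```sql
-- Average and count per device over 1-minute tumbling windows; each window's
-- result is written to the Cosmos DB output. deviceId, temperature, and the
-- window size are hypothetical examples.
SELECT
    deviceId,
    AVG(temperature) AS averageTemperature,
    COUNT(*) AS eventCount,
    System.Timestamp() AS windowEnd
INTO
    [cosmos-db-output]
FROM
    [eventhub-input]
GROUP BY
    deviceId,
    TumblingWindow(minute, 1)
```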
articles/stream-analytics/no-code-stream-processing.md (+1 -1)
@@ -62,7 +62,7 @@ After you set up your Event Hubs credentials and select **Connect**, you can add
You can always edit the field names, or remove or change the data type, by selecting the three dot symbol next to each field. You can also expand, select, and edit any nested fields from the incoming messages, as shown in the following image.
:::image type="content" source="./media/no-code-stream-processing/event-hub-schema.png" alt-text="Screenshot showing Event Hub fields where you add, remove, and edit the fields." lightbox="./media/no-code-stream-processing/event-hub-schema.png" :::
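Nested fields that you expand and select in the editor correspond to dot-path references in the Stream Analytics query language. A small sketch with hypothetical field names, selecting one top-level field and two nested properties:

```sql
-- Rename a top-level field and reach into a nested record with dot notation.
-- All field names and aliases here are hypothetical examples.
SELECT
    deviceId AS device,
    location.latitude AS latitude,
    location.longitude AS longitude
INTO
    [output]
FROM
    [eventhub-input]
```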
articles/stream-analytics/stream-analytics-autoscale.md (+2 -2)
@@ -118,8 +118,8 @@ The previous section shows you how to add a default condition for the autoscale
5. Select **+ Add a rule** to add a rule to increase streaming units when the overall SU % utilization goes above 75%. Follow steps from the preceding **Default condition** section.
6. Set the **minimum**, **maximum**, and **default** number of streaming units.
7. Set **Schedule**, **Timezone**, **Start date**, and **End date** on the custom condition (but not on the default condition). You can either specify start and end dates for the condition or select **Repeat specific days** (Monday, Tuesday, and so on) of a week.
-    1. If you select **Specify start/end dates**, select the **Timezone**, **Start date and time**, and **End date and time** for the condition to be in effect.
-    2. If you select **Repeat specific days**, select the days of the week, timezone, start time, and end time when the condition should apply.
+    - If you select **Specify start/end dates**, select the **Timezone**, **Start date and time**, and **End date and time** for the condition to be in effect.
+    - If you select **Repeat specific days**, select the days of the week, timezone, start time, and end time when the condition should apply.
articles/stream-analytics/stream-analytics-concepts-checkpoint-replay.md (+1 -1)
@@ -46,7 +46,7 @@ In general, the amount of replay needed is proportional to the size of the windo
## Estimate replay catch-up time
To estimate the length of the delay due to a service upgrade, you can follow this technique:

-1. Load the input Event Hub with sufficient data to cover the largest window size in your query, at expected event rate. The events’ timestamp should be close to the wall clock time throughout that period of time, as if it’s a live input feed. For example, if you have a 3-day window in your query, send events to Event Hub for three days, and continue to send events.
+1. Load the input Event Hubs with sufficient data to cover the largest window size in your query, at expected event rate. The events’ timestamp should be close to the wall clock time throughout that period of time, as if it’s a live input feed. For example, if you have a 3-day window in your query, send events to Event Hubs for three days, and continue to send events.
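To make "largest window size" concrete, the window that determines how much replay is needed is the one declared in the query. A hypothetical example of a query whose state spans three days of input, which is why roughly three days of events would need to be replayed after an upgrade:

```sql
-- A 3-day sliding window: the job keeps up to three days of input in state,
-- so roughly that much data must be replayed to rebuild the state after an
-- upgrade. Field names are hypothetical.
SELECT
    deviceId,
    COUNT(*) AS eventsLastThreeDays
INTO
    [output]
FROM
    [eventhub-input]
GROUP BY
    deviceId,
    SlidingWindow(day, 3)
```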
0 commit comments