
Commit 955646e

Committed May 12, 2022
updated per PR reviewer feedback
1 parent f243ee1 commit 955646e

16 files changed: +18 −18 lines changed

articles/event-hubs/TOC.yml

Lines changed: 2 additions & 2 deletions

@@ -193,10 +193,10 @@
       href: event-hubs-programming-guide.md
     - name: Process data
       items:
-        - name: Capture Event Hub data in Parquet format
+        - name: Capture Event Hubs data in Parquet format
          href: ../stream-analytics/capture-event-hub-data-parquet.md?toc=%2fazure%2fevent-hubs%2ftoc.json
        - name: Materialize data to Cosmos DB
-         href: ../stream-analytics/no-code-materialize-cosmosdb.md?toc=%2fazure%2fevent-hubs%2ftoc.json
+         href: ../stream-analytics/no-code-materialize-cosmos-db.md?toc=%2fazure%2fevent-hubs%2ftoc.json
        - name: Filter and ingest Synapse SQL data
          href: ../stream-analytics/filter-ingest-synapse-sql.md?toc=%2fazure%2fevent-hubs%2ftoc.json
        - name: Filter and ingest to Data Lake Storage Gen2

articles/stream-analytics/TOC.yml

Lines changed: 2 additions & 2 deletions

@@ -202,10 +202,10 @@
       href: geo-redundancy.md
     - name: Build with no code editor
       items:
-        - name: Capture Event Hub data in Parquet format
+        - name: Capture Event Hubs data in Parquet format
          href: capture-event-hub-data-parquet.md
        - name: Materialize data to Cosmos DB
-         href: no-code-materialize-cosmosdb.md
+         href: no-code-materialize-cosmos-db.md
        - name: Filter and ingest Synapse SQL data
          href: filter-ingest-synapse-sql.md
        - name: Filter and ingest to Data Lake Storage Gen2

articles/stream-analytics/capture-event-hub-data-parquet.md

Lines changed: 2 additions & 2 deletions

@@ -8,7 +8,7 @@ ms.topic: how-to
 ms.custom: mvc
 ms.date: 05/08/2022
 ---
-# Capture data from Event Hub in Parquet format
+# Capture data from Event Hubs in Parquet format

 This article explains how to use the no code editor to automatically capture streaming data in Event Hubs in an Azure Data Lake Storage Gen2 account in Parquet format. You have the flexibility of specifying a time or size interval.

@@ -44,7 +44,7 @@ Use the following steps to configure a Stream Analytics job to capture data in A
 1. Choose the output start time.
 1. Select the number of Streaming Units (SU) that the job runs with. SU represents the computing resources that are allocated to execute a Stream Analytics job. For more information, see [Streaming Units in Azure Stream Analytics](stream-analytics-streaming-unit-consumption.md).
 1. In the **Choose Output data error handling** list, select the behavior you want when the output of the job fails due to data error. Select **Retry** to have the job retry until it writes successfully or select another option.
-   :::image type="content" source="./media/capture-event-hub-data-parquet/start-job.png" alt-text="ALTTEXT" lightbox="./media/capture-event-hub-data-parquet/start-job.png" :::
+   :::image type="content" source="./media/capture-event-hub-data-parquet/start-job.png" alt-text="Screenshot showing the Start Stream Analytics job window where you set the output start time, streaming units, and error handling." lightbox="./media/capture-event-hub-data-parquet/start-job.png" :::

 The new job is shown on the **Stream Analytics jobs** tab. Select **Open metrics** to monitor it.
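Editor's note: once the capture job has written a few files, a minimal read-back sketch like the following can confirm the Parquet output is well formed. It assumes the pandas, pyarrow, and adlfs packages; the storage account, container, and path names are hypothetical placeholders.

```python
# Read the captured Parquet files from Data Lake Storage Gen2 via adlfs/fsspec.
import pandas as pd

df = pd.read_parquet(
    "abfs://capture-container@mystorageaccount.dfs.core.windows.net/eventhub-capture/",
    storage_options={"account_key": "<storage-account-key>"},  # hypothetical credential
)
print(df.head())  # a few captured events, one row per event
```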

articles/stream-analytics/filter-ingest-data-lake-storage-gen2.md

Lines changed: 3 additions & 3 deletions

@@ -26,7 +26,7 @@ This article describes how you can use the no code editor to easily create a Str
 1. Enter a name for the Stream Analytics job, then select **Create**.
    :::image type="content" source="./media/filter-ingest-data-lake-storage-gen2/create-job.png" alt-text="Screenshot showing where to enter a job name." lightbox="./media/filter-ingest-data-lake-storage-gen2/create-job.png" :::
 1. Specify the **Serialization** type of your data in the Event Hubs window and the **Authentication method** that the job will use to connect to the Event Hubs. Then select **Connect**.
-   :::image type="content" source="./media/filter-ingest-data-lake-storage-gen2/event-hub-review-connect.png" alt-text="Screenshot showing the Event Hub area where you select Serialization and Authentication method." lightbox="./media/filter-ingest-data-lake-storage-gen2/event-hub-review-connect.png" :::
+   :::image type="content" source="./media/filter-ingest-data-lake-storage-gen2/event-hub-review-connect.png" alt-text="Screenshot showing the Event Hubs area where you select Serialization and Authentication method." lightbox="./media/filter-ingest-data-lake-storage-gen2/event-hub-review-connect.png" :::
 1. If the connection is established successfully and you have data streams flowing into the Event Hubs instance, you'll immediately see two things:
    1. Fields that are present in the input data. You can choose **Add field** or select the three dot symbol next to each field to remove, rename, or change its type.
       :::image type="content" source="./media/filter-ingest-data-lake-storage-gen2/add-field.png" alt-text="Screenshot showing where you can add a field or remove, rename, or change a field type." lightbox="./media/filter-ingest-data-lake-storage-gen2/add-field.png" :::

@@ -45,11 +45,11 @@ This article describes how you can use the no code editor to easily create a Str
    :::image type="content" source="./media/filter-ingest-data-lake-storage-gen2/no-code-save-start.png" alt-text="Screenshot showing the job Save and Start options." lightbox="./media/filter-ingest-data-lake-storage-gen2/no-code-save-start.png" :::
 1. To start the job, specify the number of **Streaming Units (SUs)** that the job runs with. SUs represent the amount of compute and memory allocated to the job. We recommend that you start with three and then adjust as needed.
 1. After you select **Start**, the job starts running within two minutes.
-   :::image type="content" source="./media/filter-ingest-data-lake-storage-gen2/no-code-start-job.png" alt-text="Screenshot showing the Start Stream Analytics job window where you select Start." lightbox="./media/filter-ingest-data-lake-storage-gen2/no-code-start-job.png" :::
+   :::image type="content" source="./media/filter-ingest-data-lake-storage-gen2/no-code-start-job.png" alt-text="Screenshot showing the Start Stream Analytics job window." lightbox="./media/filter-ingest-data-lake-storage-gen2/no-code-start-job.png" :::

 You can see the job under the Process Data section in the **Stream Analytics jobs** tab. Select **Open metrics** to monitor it or stop and restart it, as needed.

-   :::image type="content" source="./media/filter-ingest-data-lake-storage-gen2/no-code-list-jobs.png" alt-text="Screenshot showing the Stream Analytics jobs tab where you can view the the current job status." lightbox="./media/filter-ingest-data-lake-storage-gen2/no-code-list-jobs.png" :::
+   :::image type="content" source="./media/filter-ingest-data-lake-storage-gen2/no-code-list-jobs.png" alt-text="Screenshot showing the Stream Analytics jobs tab." lightbox="./media/filter-ingest-data-lake-storage-gen2/no-code-list-jobs.png" :::

 ## Next steps
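Editor's note: after the job has been running for a while, a short sketch like this can list what the job has landed in the Data Lake Storage Gen2 container. It assumes the azure-storage-file-datalake package; the connection string, container, and folder names are hypothetical placeholders.

```python
# List the files the Stream Analytics job has written to the output container.
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient.from_connection_string("<storage-connection-string>")
filesystem = service.get_file_system_client("output-container")  # hypothetical container

for path in filesystem.get_paths(path="filtered-events/"):       # hypothetical folder
    print(path.name, path.last_modified)
```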

articles/stream-analytics/no-code-materialize-cosmosdb.md renamed to articles/stream-analytics/no-code-materialize-cosmos-db.md

Lines changed: 5 additions & 5 deletions

@@ -23,19 +23,19 @@ Use the following steps to develop a Stream Analytics job to materialize data in

 1. In the Azure portal, locate and select your Azure Event Hubs instance.
 2. Under **Features**, select **Process Data**. Then, select **Start** in the card titled **Materialize Data in Cosmos DB**.
-   :::image type="content" source="./media/no-code-materialize-cosmosdb/no-code-materialize-view-start.png" alt-text="Screenshot showing the Start Materialize Data Flow." lightbox="./media/no-code-materialize-cosmosdb/no-code-materialize-view-start.png" :::
+   :::image type="content" source="./media/no-code-materialize-cosmos-db/no-code-materialize-view-start.png" alt-text="Screenshot showing the Start Materialize Data Flow." lightbox="./media/no-code-materialize-cosmos-db/no-code-materialize-view-start.png" :::
 3. Enter a name for your job and select **Create**.
 4. Specify the **Serialization** type of your data in the event hub and the **Authentication method** that the job will use to connect to the Event Hubs. Then select **Connect**.
 5. If the connection is successful and you have data streams flowing into your Event Hubs instance, you'll immediately see two things:
    - Fields that are present in your input payload. Select the three dot symbol next to a field to optionally remove, rename, or change the data type of the field.
-     :::image type="content" source="./media/no-code-materialize-cosmosdb/no-code-schema.png" alt-text="Screenshot showing the event hub fields of input for you to review." lightbox="./media/no-code-materialize-cosmosdb/no-code-schema.png" :::
+     :::image type="content" source="./media/no-code-materialize-cosmos-db/no-code-schema.png" alt-text="Screenshot showing the event hub fields of input for you to review." lightbox="./media/no-code-materialize-cosmos-db/no-code-schema.png" :::
    - A sample of your input data in the bottom pane under **Data preview** that automatically refreshes periodically. You can select **Pause streaming preview** if you prefer to have a static view of your sample input data.
-     :::image type="content" source="./media/no-code-materialize-cosmosdb/no-code-sample-input.png" alt-text="Screenshot showing sample input data." lightbox="./media/no-code-materialize-cosmosdb/no-code-sample-input.png" :::
+     :::image type="content" source="./media/no-code-materialize-cosmos-db/no-code-sample-input.png" alt-text="Screenshot showing sample input data." lightbox="./media/no-code-materialize-cosmos-db/no-code-sample-input.png" :::
 6. In the next step, you specify the field and the **aggregate** you want to calculate, such as Average and Count. You can also specify the field that you want to **Group By** along with the **time window**. Then you can validate the results of the step in the **Data preview** section.
-   :::image type="content" source="./media/no-code-materialize-cosmosdb/no-code-group-by.png" alt-text="Screenshot showing the Group By area." lightbox="./media/no-code-materialize-cosmosdb/no-code-group-by.png" :::
+   :::image type="content" source="./media/no-code-materialize-cosmos-db/no-code-group-by.png" alt-text="Screenshot showing the Group By area." lightbox="./media/no-code-materialize-cosmos-db/no-code-group-by.png" :::
 7. Choose the **Cosmos DB database** and **container** where you want results written.
 8. Start the Stream Analytics job by selecting **Start**.
-   :::image type="content" source="./media/no-code-materialize-cosmosdb/no-code-cosmosdb-start.png" alt-text="Screenshot showing your definition where you select Start." lightbox="./media/no-code-materialize-cosmosdb/no-code-cosmosdb-start.png" :::
+   :::image type="content" source="./media/no-code-materialize-cosmos-db/no-code-cosmos-db-start.png" alt-text="Screenshot showing your definition where you select Start." lightbox="./media/no-code-materialize-cosmos-db/no-code-cosmos-db-start.png" :::
   To start the job, you must specify:
   - The number of **Streaming Units (SU)** the job runs with. SUs represent the amount of compute and memory allocated to the job. We recommend that you start with three and adjust as needed.
   - **Output data error handling** allows you to specify the behavior you want when a job's output to your destination fails due to data errors. By default, your job retries until the write operation succeeds. You can also choose to drop output events.
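Editor's note: once results start landing, a minimal sketch like this can read back the most recently materialized documents. It assumes the azure-cosmos package; the endpoint, key, database, and container names are hypothetical placeholders.

```python
# Read back the latest materialized documents from the Cosmos DB container.
from azure.cosmos import CosmosClient

client = CosmosClient(
    "https://myaccount.documents.azure.com:443/",  # hypothetical account endpoint
    credential="<account-key>",
)
container = client.get_database_client("telemetry").get_container_client("aggregates")

# _ts is the server-side epoch timestamp Cosmos DB stamps on every document.
for item in container.query_items(
    query="SELECT TOP 10 * FROM c ORDER BY c._ts DESC",
    enable_cross_partition_query=True,
):
    print(item)
```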

articles/stream-analytics/no-code-stream-processing.md

Lines changed: 1 addition & 1 deletion

@@ -62,7 +62,7 @@ After you set up your Event Hubs credentials and select **Connect**, you can add

 You can always edit the field names, or remove or change the data type, by selecting the three dot symbol next to each field. You can also expand, select, and edit any nested fields from the incoming messages, as shown in the following image.

-:::image type="content" source="./media/no-code-stream-processing/event-hub-schema.png" alt-text="Screenshot showing Event Hub fields" lightbox="./media/no-code-stream-processing/event-hub-schema.png" :::
+:::image type="content" source="./media/no-code-stream-processing/event-hub-schema.png" alt-text="Screenshot showing Event Hub fields where you add, remove, and edit the fields." lightbox="./media/no-code-stream-processing/event-hub-schema.png" :::

 The available data types are:

articles/stream-analytics/stream-analytics-autoscale.md

Lines changed: 2 additions & 2 deletions

@@ -118,8 +118,8 @@ The previous section shows you how to add a default condition for the autoscale
 5. Select **+ Add a rule** to add a rule to increase streaming units when the overall SU % utilization goes above 75%. Follow steps from the preceding **Default condition** section.
 6. Set the **minimum**, **maximum**, and **default** number of streaming units.
 7. Set **Schedule**, **Timezone**, **Start date**, and **End date** on the custom condition (but not on the default condition). You can either specify start and end dates for the condition, or select **Repeat specific days** (Monday, Tuesday, and so on) of a week.
-   1. If you select **Specify start/end dates**, select the **Timezone**, **Start date and time**, and **End date and time** for the condition to be in effect.
-   2. If you select **Repeat specific days**, select the days of the week, timezone, start time, and end time when the condition should apply.
+   - If you select **Specify start/end dates**, select the **Timezone**, **Start date and time**, and **End date and time** for the condition to be in effect.
+   - If you select **Repeat specific days**, select the days of the week, timezone, start time, and end time when the condition should apply.

 ### Scale to specific number of streaming units
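Editor's note: for the custom condition above, a hedged sketch of creating a comparable 75% scale-out rule programmatically follows, assuming the azure-mgmt-monitor and azure-identity packages. The subscription, resource group, job name, region, and the metric name "ResourceUtilization" (taken here to be the SU % utilization metric) are assumptions; verify them against your job's metric definitions before use.

```python
# Sketch: an Azure Monitor autoscale rule that adds streaming units
# when average SU utilization exceeds 75% over a 5-minute window.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import (
    AutoscaleProfile, AutoscaleSettingResource, MetricTrigger,
    ScaleAction, ScaleCapacity, ScaleRule,
)

SUB = "<subscription-id>"  # hypothetical
JOB = ("/subscriptions/" + SUB + "/resourceGroups/my-rg/providers/"
       "Microsoft.StreamAnalytics/streamingjobs/my-job")  # hypothetical job

client = MonitorManagementClient(DefaultAzureCredential(), SUB)

scale_out = ScaleRule(
    metric_trigger=MetricTrigger(
        metric_name="ResourceUtilization",   # assumed SU % utilization metric name
        metric_resource_uri=JOB,
        time_grain=timedelta(minutes=1),
        statistic="Average",
        time_window=timedelta(minutes=5),
        time_aggregation="Average",
        operator="GreaterThan",
        threshold=75,                        # scale out above 75% SU utilization
    ),
    scale_action=ScaleAction(
        direction="Increase", type="ChangeCount",
        value="1", cooldown=timedelta(minutes=5),
    ),
)

profile = AutoscaleProfile(
    name="custom-condition",
    capacity=ScaleCapacity(minimum="1", maximum="6", default="3"),
    rules=[scale_out],
)

client.autoscale_settings.create_or_update(
    "my-rg", "asa-autoscale",                # hypothetical names
    AutoscaleSettingResource(
        location="eastus", target_resource_uri=JOB,
        profiles=[profile], enabled=True,
    ),
)
```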

articles/stream-analytics/stream-analytics-concepts-checkpoint-replay.md

Lines changed: 1 addition & 1 deletion

@@ -46,7 +46,7 @@ In general, the amount of replay needed is proportional to the size of the windo
 ## Estimate replay catch-up time
 To estimate the length of the delay due to a service upgrade, you can follow this technique:

-1. Load the input Event Hub with sufficient data to cover the largest window size in your query, at expected event rate. The events’ timestamp should be close to the wall clock time throughout that period of time, as if it’s a live input feed. For example, if you have a 3-day window in your query, send events to Event Hub for three days, and continue to send events.
+1. Load the input Event Hubs with sufficient data to cover the largest window size in your query, at expected event rate. The events’ timestamp should be close to the wall clock time throughout that period of time, as if it’s a live input feed. For example, if you have a 3-day window in your query, send events to Event Hubs for three days, and continue to send events.

 2. Start the job using **Now** as the start time.
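Editor's note: for the load step above, a minimal producer sketch like this can feed the input event hub at the expected rate with near-wall-clock timestamps. It assumes the azure-eventhub package; the connection-string environment variable and event hub name are hypothetical placeholders.

```python
# Feed the event hub at a steady rate so event timestamps track wall-clock time.
import json
import os
import time
from datetime import datetime, timezone

from azure.eventhub import EventData, EventHubProducerClient

EVENTS_PER_SECOND = 100  # match your expected production event rate

producer = EventHubProducerClient.from_connection_string(
    os.environ["EVENTHUB_CONNECTION_STRING"],  # hypothetical env var
    eventhub_name="replay-test",               # hypothetical event hub
)

with producer:
    while True:  # run for at least the largest window in your query
        batch = producer.create_batch()
        for _ in range(EVENTS_PER_SECOND):
            payload = {"ts": datetime.now(timezone.utc).isoformat(), "value": 1}
            batch.add(EventData(json.dumps(payload)))
        producer.send_batch(batch)
        time.sleep(1)  # one batch per second keeps timestamps near wall-clock
```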
