
Commit 56ae997 (parent c1b8aa4), committed Apr 27, 2018
New consumer group screenshot

File tree: 2 files changed, +17 −4 lines
‎articles/stream-analytics/stream-analytics-event-hub-consumer-groups.md

You can use Azure Event Hubs in Azure Stream Analytics to ingest or output data from a job. A best practice for using Event Hubs is to use multiple consumer groups to ensure job scalability. One reason is that the number of readers in the Stream Analytics job for a specific input affects the number of readers in a single consumer group. The precise number of receivers is based on internal implementation details of the scale-out topology logic and is not exposed externally. The number of readers can change either at job start time or during job upgrades.

The error shown when the number of receivers exceeds the maximum is:

`The streaming job failed: Stream Analytics job has validation errors: Job will exceed the maximum amount of Event Hub Receivers.`
> [!NOTE]
> When the number of readers changes during a job upgrade, transient warnings are written to audit logs. Stream Analytics jobs automatically recover from these transient issues.
## Add a consumer group in Event Hubs
To add a new consumer group in your Event Hubs instance, follow these steps:

1. Sign in to the Azure portal.

2. Locate your Event Hubs namespace.
3. Select **Event Hubs** under the **Entities** heading.

4. Select the Event Hub by name.

5. On the **Event Hubs Instance** page, under the **Entities** heading, select **Consumer groups**. A consumer group named **$Default** is listed.

6. Select **+ Consumer Group** to add a new consumer group.

![Add a consumer group in Event Hubs](media/stream-analytics-event-hub-consumer-groups/new-eh-consumer-group.png)

7. When you created the input in the Stream Analytics job to point to the Event Hub, you specified the consumer group there. $Default is used when none is specified. After you create a new consumer group, edit the Event Hub input in the Stream Analytics job and specify the name of the new consumer group.
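For scripted or automated deployments, the same consumer group can also be created with the Azure CLI instead of the portal. The following is a minimal sketch: the resource group, namespace, event hub, and consumer group names are placeholders, and the script only echoes the command (a dry run) rather than executing it against your subscription.

```shell
# Hypothetical resource names -- replace with your own values.
RESOURCE_GROUP="my-resource-group"
NAMESPACE="my-eh-namespace"
EVENT_HUB="my-event-hub"
NEW_CONSUMER_GROUP="asa-input-cg"

# Build the Azure CLI command that creates the consumer group.
CMD="az eventhubs eventhub consumer-group create \
  --resource-group $RESOURCE_GROUP \
  --namespace-name $NAMESPACE \
  --eventhub-name $EVENT_HUB \
  --name $NEW_CONSUMER_GROUP"

# Dry run: print the command instead of executing it.
echo "$CMD"
```

To run it for real, replace the final `echo "$CMD"` with `$CMD` (or paste the printed command into a shell where you are signed in with `az login`).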


## Number of readers per partition exceeds Event Hubs limit of five

If your streaming query syntax references the same input Event Hub resource multiple times, the job engine can use multiple readers per query from that same consumer group. When there are too many references to the same consumer group, the job can exceed the limit of five and throw an error. In those circumstances, you can further divide the readers by using multiple inputs across multiple consumer groups, using the solution described in the following section.
Scenarios in which the number of readers per partition exceeds the Event Hubs limit of five include the following:

* Multiple SELECT statements: If you use multiple SELECT statements that refer to the **same** event hub input, each SELECT statement causes a new receiver to be created.
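The arithmetic behind this limit can be sketched with a small, self-contained shell snippet. This is an illustration only, not Stream Analytics code: it assumes each SELECT statement referencing the same input opens one receiver per partition, and that those receivers can be split evenly across consumer groups.

```shell
# Toy model: Event Hubs allows at most five readers per partition
# per consumer group.
MAX_RECEIVERS_PER_PARTITION=5

# validate_receivers SELECT_COUNT [CONSUMER_GROUPS]
# Fails when any single consumer group would carry more than five
# receivers on a partition.
validate_receivers() {
  local selects=$1 groups=${2:-1}
  # Worst case per consumer group: ceiling(selects / groups).
  local worst=$(( (selects + groups - 1) / groups ))
  if [ "$worst" -gt "$MAX_RECEIVERS_PER_PARTITION" ]; then
    echo "Job will exceed the maximum amount of Event Hub Receivers." >&2
    return 1
  fi
  return 0
}

validate_receivers 5          # 5 receivers, 1 group: at the limit, OK
validate_receivers 6 2        # 6 receivers over 2 groups = 3 each: OK
validate_receivers 6 || true  # 6 receivers, 1 group: fails validation
```

Splitting the six references across two consumer groups keeps each group under the limit, which is exactly the mitigation this section describes.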
