Commit 67a2939

update demo/stack resources, airflow-scheduled-job and hbase-hdfs-load-cycling-data demos

1 parent ac98f4c

11 files changed: +31 -33 lines changed

demos/demos-v2.yaml

Lines changed: 2 additions & 2 deletions
@@ -70,7 +70,7 @@ demos:
     supportedNamespaces: ["default"]
     resourceRequests:
       cpu: 8700m
-      memory: 29746Mi
+      memory: 42034Mi
       pvc: 75Gi # 30Gi for Kafka
   nifi-kafka-druid-water-level-data:
     description: Demo ingesting water level data into Kafka using NiFi, streaming it into Druid and creating a Superset dashboard
@@ -91,7 +91,7 @@ demos:
     supportedNamespaces: ["default"]
     resourceRequests:
      cpu: 8900m
-      memory: 30042Mi
+      memory: 42330Mi
       pvc: 75Gi # 30Gi for Kafka
   spark-k8s-anomaly-detection-taxi-data:
     description: Demo loading New York taxi data into an S3 bucket and carrying out an anomaly detection analysis on it
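The bumped `memory` values are the demos' declared resource requests. To inspect a demo's catalog entry without installing it, you can query stackablectl; a minimal sketch, assuming a stackablectl version that ships the `demo describe` subcommand:

[source,console]
----
# Show the catalog entry for one of the demos touched by this commit.
$ stackablectl demo describe nifi-kafka-druid-water-level-data
----

If your version prints resource requests, the memory value should match the 42330Mi now set in demos-v2.yaml above.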

docs/modules/demos/pages/airflow-scheduled-job.adoc

Lines changed: 2 additions & 2 deletions
@@ -98,7 +98,7 @@ continuously:
 
 image::airflow-scheduled-job/airflow_7.png[]
 
-Click on the `run_every_minute` box in the centre of the page and then select `Log`:
+Click on the `run_every_minute` box in the centre of the page and then select `Logs`:
 
 [WARNING]
 ====
@@ -118,7 +118,7 @@ image::airflow-scheduled-job/airflow_10.png[]
 
 Go back to DAG overview screen. The `sparkapp_dag` job has a scheduled entry of `None` and a last-execution time
 (`2022-09-19, 07:36:55`). This allows a DAG to be executed exactly once, with neither schedule-based runs nor any
-https://airflow.apache.org/docs/apache-airflow/stable/dag-run.html?highlight=backfill#backfill[backfill]. The DAG can
+https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/dag-run.html#backfill[backfill]. The DAG can
 always be triggered manually again via REST or from within the Webserver UI.
 
 image::airflow-scheduled-job/airflow_11.png[]
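The last context lines mention triggering the DAG manually via REST. For illustration only (not part of this commit), a manual trigger through Airflow's stable REST API could look like the following; the URL and credentials are placeholders for the demo's webserver address and admin login:

[source,console]
----
# Create one DAG run of sparkapp_dag via the Airflow stable REST API.
# AIRFLOW_URL and the credentials are hypothetical placeholders.
$ curl -X POST "${AIRFLOW_URL}/api/v1/dags/sparkapp_dag/dagRuns" \
    --user "admin:${ADMIN_PASSWORD}" \
    -H 'Content-Type: application/json' \
    -d '{}'
----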

docs/modules/demos/pages/hbase-hdfs-load-cycling-data.adoc

Lines changed: 26 additions & 28 deletions
@@ -51,33 +51,31 @@ image::hbase-hdfs-load-cycling-data/overview.png[]
 
 To list the installed Stackable services run the following command: `stackablectl stacklet list`
 
-//TODO(Techassi): Update console output
-
 [source,console]
 ----
 $ stackablectl stacklet list
-PRODUCT    NAME       NAMESPACE  ENDPOINTS                      EXTRA INFOS
-
-hbase      hbase      default    regionserver                   172.18.0.5:32282
-                                 ui                             http://172.18.0.5:31527
-                                 metrics                        172.18.0.5:31081
-
-hdfs       hdfs       default    datanode-default-0-metrics     172.18.0.2:31441
-                                 datanode-default-0-data        172.18.0.2:32432
-                                 datanode-default-0-http        http://172.18.0.2:30758
-                                 datanode-default-0-ipc         172.18.0.2:32323
-                                 journalnode-default-0-metrics  172.18.0.5:31123
-                                 journalnode-default-0-http     http://172.18.0.5:30038
-                                 journalnode-default-0-https    https://172.18.0.5:31996
-                                 journalnode-default-0-rpc      172.18.0.5:30080
-                                 namenode-default-0-metrics     172.18.0.2:32753
-                                 namenode-default-0-http        http://172.18.0.2:32475
-                                 namenode-default-0-rpc         172.18.0.2:31639
-                                 namenode-default-1-metrics     172.18.0.4:32202
-                                 namenode-default-1-http        http://172.18.0.4:31486
-                                 namenode-default-1-rpc         172.18.0.4:31874
-
-zookeeper  zookeeper  default    zk                             172.18.0.4:32469
+
+┌───────────┬───────────┬───────────┬──────────────────────────────────────────────────────────────┬─────────────────────────────────┐
+│ PRODUCT   ┆ NAME      ┆ NAMESPACE ┆ ENDPOINTS                                                    ┆ CONDITIONS                      │
+╞═══════════╪═══════════╪═══════════╪══════════════════════════════════════════════════════════════╪═════════════════════════════════╡
+│ hbase     ┆ hbase     ┆ default   ┆ regionserver                          172.18.0.2:31521       ┆ Available, Reconciling, Running │
+│           ┆           ┆           ┆ ui-http                               172.18.0.2:32064       ┆                                 │
+│           ┆           ┆           ┆ metrics                               172.18.0.2:31372       ┆                                 │
+├╌╌╌╌╌╌╌╌╌╌╌┼╌╌╌╌╌╌╌╌╌╌╌┼╌╌╌╌╌╌╌╌╌╌╌┼╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌┼╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌┤
+│ hdfs      ┆ hdfs      ┆ default   ┆ datanode-default-0-listener-data      172.18.0.2:31990       ┆ Available, Reconciling, Running │
+│           ┆           ┆           ┆ datanode-default-0-listener-http      http://172.18.0.2:30659 ┆                                │
+│           ┆           ┆           ┆ datanode-default-0-listener-ipc       172.18.0.2:30678       ┆                                 │
+│           ┆           ┆           ┆ datanode-default-0-listener-metrics   172.18.0.2:31531       ┆                                 │
+│           ┆           ┆           ┆ namenode-default-0-http               http://172.18.0.2:32543 ┆                                │
+│           ┆           ┆           ┆ namenode-default-0-metrics            172.18.0.2:30098       ┆                                 │
+│           ┆           ┆           ┆ namenode-default-0-rpc                172.18.0.2:30915       ┆                                 │
+│           ┆           ┆           ┆ namenode-default-1-http               http://172.18.0.2:31333 ┆                                │
+│           ┆           ┆           ┆ namenode-default-1-metrics            172.18.0.2:30862       ┆                                 │
+│           ┆           ┆           ┆ namenode-default-1-rpc                172.18.0.2:31440       ┆                                 │
+├╌╌╌╌╌╌╌╌╌╌╌┼╌╌╌╌╌╌╌╌╌╌╌┼╌╌╌╌╌╌╌╌╌╌╌┼╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌┼╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌╌┤
+│ zookeeper ┆ zookeeper ┆ default   ┆                                                              ┆ Available, Reconciling, Running │
+└───────────┴───────────┴───────────┴──────────────────────────────────────────────────────────────┴─────────────────────────────────┘
+
 ----
 
 include::partial$instance-hint.adoc[]
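If you want to post-process the stacklet listing instead of reading the box-drawn table, newer stackablectl releases can emit structured output; a sketch, assuming your version supports the `--output` flag:

[source,console]
----
# Same listing as above, but machine-readable (pipe into jq or similar).
$ stackablectl stacklet list --output json
----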
@@ -217,18 +215,18 @@ Below you will see the overview of your HDFS cluster.
 
 image::hbase-hdfs-load-cycling-data/hdfs-overview.png[]
 
-The UI will give you information on the datanodes via the datanodes tab.
+The UI will give you information on the datanodes via the `Datanodes` tab.
 
 image::hbase-hdfs-load-cycling-data/hdfs-datanode.png[]
 
-You can also browse the filesystem via the Utilities menu.
+You can also browse the file system by clicking on the `Utilities` tab and selecting `Browse the file system`.
 
 image::hbase-hdfs-load-cycling-data/hdfs-data.png[]
 
-The raw data from the distcp job can be found here.
+Navigate in the file system to the folder `data` and then the `raw` folder. Here you can find the raw data from the distcp job.
 
 image::hbase-hdfs-load-cycling-data/hdfs-data-raw.png[]
 
-The structure of the Hfiles can be seen here.
+Selecting the folder `data` and then `hfile` instead, gives you the structure of the Hfiles.
 
 image::hbase-hdfs-load-cycling-data/hdfs-data-hfile.png[]
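The rewritten steps browse the `data/raw` and `data/hfile` folders through the namenode UI. As an illustrative cross-check from the command line (not part of this commit), you could list the same folders from inside the namenode pod; the pod and container names are inferred from the stacklet listing above, and this assumes the `hdfs` client is on the container's PATH:

[source,console]
----
# List the distcp output and the generated HFiles recursively.
# Pod and container names are assumptions based on the demo's naming.
$ kubectl exec -n default hdfs-namenode-default-0 -c namenode -- \
    hdfs dfs -ls -R /data
----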

stacks/stacks-v2.yaml

Lines changed: 1 addition & 1 deletion
@@ -220,7 +220,7 @@ stacks:
     supportedNamespaces: []
     resourceRequests:
       cpu: 8900m
-      memory: 30042Mi
+      memory: 42330Mi
       pvc: 75Gi
     parameters:
       - name: nifiAdminPassword
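For orientation, the demo and stack catalogs use the same `resourceRequests` shape, so this stack bump mirrors the demo bumps above; a minimal sketch of the stanza as this commit leaves it (the comments are editorial gloss, not from the file):

[source,yaml]
----
resourceRequests:
  cpu: 8900m      # unchanged in this commit
  memory: 42330Mi # raised from 30042Mi
  pvc: 75Gi       # total persistent volume capacity requested
----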
