
Commit 5e572af

Authored by xeniape, siegfriedweber, Techassi, razvan and NickLarsenNZ

Chore: update demos for release (#60)

* upgrade postgres and redis versions, upgrade airflow-scheduled-job demo versions
* update versions in hbase-hdfs-load-cycling-data and airflow-scheduled-job demos
* Update logging demo for the next release
* Update signal-processing demo and container image
* upgrade minio version and nifi-kafka-druid-superset-s3 versions and druid db credentials
* anomaly-detection: update trino, superset, spark products
* docs(demos/data-lakehouse-iceberg-trino-spark): update requirements to 12 nodes
* docs(demos/data-lakehouse-iceberg-trino-spark): be less specific about how many files appear in minio
* docs(demos/data-lakehouse-iceberg-trino-spark): update the link for tpch
* docs(demos/data-lakehouse-iceberg-trino-spark): upgrade the NOTE about bug to IMPORTANT, and move it above the image
* bump versions for jupyterhub-pyspark demo
* bump opa version for the trino-superset-s3 stack
* bump testing-tools in nifi-kafka-druid demos and revert druid version bump
* bump end-2-end-security versions
* consolidate stackable versions
* more version bumps (untested)
* bump to 24.7.0
* adapt to release 24.7
* Apply suggestions from code review

Co-authored-by: Siegfried Weber <[email protected]>
Co-authored-by: Techassi <[email protected]>
Co-authored-by: Razvan-Daniel Mihai <[email protected]>
Co-authored-by: Nick Larsen <[email protected]>
Co-authored-by: Malte Sander <[email protected]>
Co-authored-by: Nick <[email protected]>

1 parent e513db2 · commit 5e572af

File tree: 75 files changed (+141, −124 lines)


demos/airflow-scheduled-job/03-enable-and-run-spark-dag.yaml

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ spec:
     spec:
       containers:
         - name: start-pyspark-job
-          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
           # N.B. it is possible for the scheduler to report that a DAG exists, only for the worker task to fail if a pod is unexpectedly
           # restarted. Additionally, the db-init job takes a few minutes to complete before the cluster is deployed. The wait/watch steps
           # below are not "water-tight" but add a layer of stability by at least ensuring that the db is initialized and ready and that

demos/airflow-scheduled-job/04-enable-and-run-date-dag.yaml

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ spec:
     spec:
       containers:
         - name: start-date-job
-          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
           # N.B. it is possible for the scheduler to report that a DAG exists, only for the worker task to fail if a pod is unexpectedly
           # restarted. Additionally, the db-init job takes a few minutes to complete before the cluster is deployed. The wait/watch steps
           # below are not "water-tight" but add a layer of stability by at least ensuring that the db is initialized and ready and that
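
The "wait/watch steps" the N.B. comment refers to are plain kubectl readiness gates run from the bumped tools image; both hunks above end before the actual commands. A minimal sketch of the pattern, with the label selector assumed for illustration:

  kubectl wait --for=condition=ready --timeout=30m pod \
    -l app.kubernetes.io/name=airflow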

demos/data-lakehouse-iceberg-trino-spark/create-nifi-ingestion-job.yaml

Lines changed: 2 additions & 2 deletions
Original file line numberDiff line numberDiff line change
@@ -9,11 +9,11 @@ spec:
99
serviceAccountName: demo-serviceaccount
1010
initContainers:
1111
- name: wait-for-kafka
12-
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
12+
image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
1313
command: ["bash", "-c", "echo 'Waiting for all kafka brokers to be ready' && kubectl wait --for=condition=ready --timeout=30m pod -l app.kubernetes.io/instance=kafka -l app.kubernetes.io/name=kafka"]
1414
containers:
1515
- name: create-nifi-ingestion-job
16-
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
16+
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
1717
command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/data-lakehouse-iceberg-trino-spark/LakehouseKafkaIngest.xml && python -u /tmp/script/script.py"]
1818
volumeMounts:
1919
- name: script
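
The wait-for-kafka gate above passes the -l flag twice. When reusing the pattern, the same constraint can be written as a single selector string, where the comma is a logical AND between label requirements:

  kubectl wait --for=condition=ready --timeout=30m pod \
    -l app.kubernetes.io/instance=kafka,app.kubernetes.io/name=kafka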

demos/data-lakehouse-iceberg-trino-spark/create-spark-ingestion-job.yaml

Lines changed: 2 additions & 2 deletions

@@ -12,11 +12,11 @@ spec:
       serviceAccountName: demo-serviceaccount
       initContainers:
         - name: wait-for-kafka
-          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
           command: ["bash", "-c", "echo 'Waiting for all kafka brokers to be ready' && kubectl wait --for=condition=ready --timeout=30m pod -l app.kubernetes.io/name=kafka -l app.kubernetes.io/instance=kafka"]
       containers:
         - name: create-spark-ingestion-job
-          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
           command: ["bash", "-c", "echo 'Submitting Spark job' && kubectl apply -f /tmp/manifest/spark-ingestion-job.yaml"]
           volumeMounts:
             - name: manifest
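
The job applies a manifest it reads from /tmp/manifest, but the hunk ends before the volume definition. A minimal sketch of how such a mount is typically wired in these demo jobs (the ConfigMap name is a hypothetical placeholder, not taken from the diff):

  volumes:
    - name: manifest
      configMap:
        name: spark-ingestion-job-manifest  # hypothetical name
  containers:
    - name: create-spark-ingestion-job
      volumeMounts:
        - name: manifest
          mountPath: /tmp/manifest  # matches the kubectl apply path above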

demos/data-lakehouse-iceberg-trino-spark/create-trino-tables.yaml

Lines changed: 2 additions & 2 deletions

@@ -9,11 +9,11 @@ spec:
       serviceAccountName: demo-serviceaccount
       initContainers:
         - name: wait-for-testdata
-          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
           command: ["bash", "-c", "echo 'Waiting for job load-test-data to finish' && kubectl wait --for=condition=complete --timeout=30m job/load-test-data"]
       containers:
         - name: create-tables-in-trino
-          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
           command: ["bash", "-c", "python -u /tmp/script/script.py"]
           volumeMounts:
             - name: script

demos/data-lakehouse-iceberg-trino-spark/setup-superset.yaml

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ spec:
     spec:
      containers:
         - name: setup-superset
-          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
           command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/data-lakehouse-iceberg-trino-spark/superset-assets.zip && python -u /tmp/script/script.py"]
           volumeMounts:
             - name: script

demos/end-to-end-security/create-spark-report.yaml

Lines changed: 2 additions & 2 deletions

@@ -12,7 +12,7 @@ spec:
       serviceAccountName: demo-serviceaccount
       initContainers:
         - name: wait-for-trino-tables
-          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
           command:
             - bash
             - -euo
@@ -23,7 +23,7 @@ spec:
             kubectl wait --timeout=30m --for=condition=complete job/create-tables-in-trino
       containers:
         - name: create-spark-report
-          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
           command:
             - bash
             - -euo
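
Unlike the single-string commands elsewhere in this commit, these two containers pass the shell flags as separate list items, and both hunks cut off mid-list. A sketch of the likely full shape, assuming the conventional pipefail argument follows the -euo shown:

  command:
    - bash
    - -euo
    - pipefail  # assumption: the hunk ends before this argument
    - -c
    - |
      kubectl wait --timeout=30m --for=condition=complete job/create-tables-in-trino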

demos/end-to-end-security/create-trino-tables.yaml

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ spec:
     spec:
       containers:
         - name: create-tables-in-trino
-          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable23.11.0
+          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
           command: ["bash", "-c", "python -u /tmp/script/script.py"]
           volumeMounts:
             - name: script

demos/hbase-hdfs-load-cycling-data/create-hfile-and-import-to-hbase.yaml

Lines changed: 1 addition & 1 deletion

@@ -9,7 +9,7 @@ spec:
     spec:
       containers:
         - name: create-hfile-and-import-to-hbase
-          image: docker.stackable.tech/stackable/hbase:2.4.17-stackable24.3.0
+          image: docker.stackable.tech/stackable/hbase:2.4.18-stackable24.7.0
           env:
             - name: HADOOP_USER_NAME
               value: stackable

demos/hbase-hdfs-load-cycling-data/distcp-cycling-data.yaml

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ spec:
     spec:
       containers:
         - name: distcp-cycling-data
-          image: docker.stackable.tech/stackable/hadoop:3.3.6-stackable24.3.0
+          image: docker.stackable.tech/stackable/hadoop:3.4.0-stackable24.7.0
           env:
             - name: HADOOP_USER_NAME
               value: stackable
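
This job uses the bumped Hadoop image to run DistCp, as the filename suggests; only the env block is visible in the hunk. For orientation, a generic invocation of the tool looks like this (both paths are placeholders, not taken from the demo):

  HADOOP_USER_NAME=stackable hadoop distcp \
    s3a://source-bucket/cycling-tripdata hdfs:///data/raw/cycling-tripdata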

demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data/Dockerfile

Lines changed: 1 addition & 1 deletion

@@ -1,6 +1,6 @@
 # docker build -t docker.stackable.tech/demos/pyspark-k8s-with-scikit-learn:3.3.0-stackable0.0.0-dev .

-FROM docker.stackable.tech/stackable/pyspark-k8s:3.5.0-stackable24.3.0
+FROM docker.stackable.tech/stackable/pyspark-k8s:3.5.0-stackable24.7.0

 COPY requirements.txt .
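
The comment on the Dockerfile's first line records the intended build command but still carries the older 3.3.0/dev tag. If the tag were kept in step with the new base image, the invocation might look like this (the tag is an assumption for illustration):

  docker build -t docker.stackable.tech/demos/pyspark-k8s-with-scikit-learn:3.5.0-stackable24.7.0 .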

demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data/load-test-data.yaml

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ spec:
     spec:
       containers:
         - name: load-ny-taxi-data
-          image: docker.stackable.tech/stackable/hadoop:3.3.6-stackable24.3.0
+          image: docker.stackable.tech/stackable/hadoop:3.4.0-stackable24.7.0
           command: ["bash", "-c", "/stackable/hadoop/bin/hdfs dfs -mkdir -p /ny-taxi-data/raw \
             && cd /tmp \
             && for month in 2020-09; do \
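
Once the job has run, the upload can be spot-checked from any pod with an HDFS client; a minimal sketch using the path created above:

  /stackable/hadoop/bin/hdfs dfs -ls /ny-taxi-data/raw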

demos/nifi-kafka-druid-earthquake-data/create-druid-ingestion-job.yaml

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ spec:
     spec:
       containers:
         - name: create-druid-ingestion-job
-          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
           command: ["bash", "-c", "curl -X POST --insecure -H 'Content-Type: application/json' -d @/tmp/ingestion-job-spec/ingestion-job-spec.json https://druid-coordinator:8281/druid/indexer/v1/supervisor"]
           volumeMounts:
             - name: ingestion-job-spec
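
After POSTing the spec, one way to confirm the supervisor was registered is to list supervisors on the same API endpoint, mirroring the demo's self-signed TLS setup with --insecure:

  curl --insecure https://druid-coordinator:8281/druid/indexer/v1/supervisor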

demos/nifi-kafka-druid-earthquake-data/create-nifi-ingestion-job.yaml

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ spec:
     spec:
       containers:
         - name: create-nifi-ingestion-job
-          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
           command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-earthquake-data/IngestEarthquakesToKafka.xml && python -u /tmp/script/script.py"]
           volumeMounts:
             - name: script

demos/nifi-kafka-druid-earthquake-data/setup-superset.yaml

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ spec:
     spec:
       containers:
         - name: setup-superset
-          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
           command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-earthquake-data/superset-assets.zip && python -u /tmp/script/script.py"]
           volumeMounts:
             - name: script

demos/nifi-kafka-druid-water-level-data/create-druid-ingestion-job.yaml

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ spec:
     spec:
       containers:
         - name: create-druid-ingestion-job
-          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
           command: ["bash", "-c", "curl -X POST --insecure -H 'Content-Type: application/json' -d @/tmp/ingestion-job-spec/stations-ingestion-job-spec.json https://druid-coordinator:8281/druid/indexer/v1/supervisor && curl -X POST --insecure -H 'Content-Type: application/json' -d @/tmp/ingestion-job-spec/measurements-ingestion-job-spec.json https://druid-coordinator:8281/druid/indexer/v1/supervisor && curl -X POST --insecure -H 'Content-Type: application/json' -d @/tmp/ingestion-job-spec/measurements-compaction-job-spec.json https://druid-coordinator:8281/druid/coordinator/v1/config/compaction"]
           volumeMounts:
             - name: ingestion-job-spec

demos/nifi-kafka-druid-water-level-data/create-nifi-ingestion-job.yaml

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ spec:
     spec:
       containers:
         - name: create-nifi-ingestion-job
-          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
           command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-water-level-data/IngestWaterLevelsToKafka.xml && python -u /tmp/script/script.py"]
           volumeMounts:
             - name: script

demos/nifi-kafka-druid-water-level-data/setup-superset.yaml

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ spec:
     spec:
       containers:
         - name: setup-superset
-          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
           command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-water-level-data/superset-assets.zip && python -u /tmp/script/script.py"]
           volumeMounts:
             - name: script
Lines changed: 2 additions & 2 deletions

@@ -1,5 +1,5 @@
-# docker build -f ./Dockerfile-nifi -t docker.stackable.tech/demos/nifi:1.25.0-postgresql .
+# docker build -f ./Dockerfile-nifi -t docker.stackable.tech/demos/nifi:1.27.0-postgresql .

-FROM docker.stackable.tech/stackable/nifi:1.25.0-stackable24.3.0
+FROM docker.stackable.tech/stackable/nifi:1.27.0-stackable24.7.0

 RUN curl --fail -o /stackable/nifi/postgresql-42.6.0.jar "https://repo.stackable.tech/repository/misc/postgresql-timescaledb/postgresql-42.6.0.jar"

demos/signal-processing/create-nifi-ingestion-job.yaml

Lines changed: 2 additions & 2 deletions

@@ -9,13 +9,13 @@ spec:
       serviceAccountName: demo-serviceaccount
       initContainers:
         - name: wait-for-timescale-job
-          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
           command: ["bash", "-c", "echo 'Waiting for timescaleDB tables to be ready'
             && kubectl wait --for=condition=complete job/create-timescale-tables-job"
             ]
       containers:
         - name: create-nifi-ingestion-job
-          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
           command: ["bash", "-c", "export PGPASSWORD=$(cat /timescale-admin-credentials/password) && \
             curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/signal-processing/DownloadAndWriteToDB.xml && \
             sed -i \"s/PLACEHOLDERPGPASSWORD/$PGPASSWORD/g\" DownloadAndWriteToDB.xml && \
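
The notable part of this job is how the database password reaches the NiFi flow definition: it is read from a mounted credentials file and substituted for a placeholder token before upload. The pattern in isolation (paths exactly as in the diff):

  export PGPASSWORD=$(cat /timescale-admin-credentials/password)
  curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/signal-processing/DownloadAndWriteToDB.xml
  sed -i "s/PLACEHOLDERPGPASSWORD/$PGPASSWORD/g" DownloadAndWriteToDB.xml

Worth noting when reusing this: with / as the sed delimiter, a password containing a slash would break the substitution.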

demos/signal-processing/create-timescale-tables.yaml

Lines changed: 1 addition & 1 deletion

@@ -9,7 +9,7 @@ spec:
       serviceAccountName: demo-serviceaccount
       initContainers:
         - name: wait-for-timescale
-          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
           command: ["bash", "-c", "echo 'Waiting for timescaleDB to be ready'
             && kubectl wait --for=condition=ready --timeout=30m pod -l app.kubernetes.io/name=postgresql-timescaledb"
             ]

demos/spark-k8s-anomaly-detection-taxi-data/create-spark-anomaly-detection-job.yaml

Lines changed: 3 additions & 3 deletions

@@ -8,11 +8,11 @@ spec:
     spec:
       initContainers:
         - name: wait-for-testdata
-          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
           command: ["bash", "-c", "echo 'Waiting for job load-ny-taxi-data to finish' && kubectl wait --for=condition=complete --timeout=30m job/load-ny-taxi-data"]
       containers:
         - name: create-spark-anomaly-detection-job
-          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
           command: ["bash", "-c", "echo 'Submitting Spark job' && kubectl apply -f /tmp/manifest/spark-ad-job.yaml"]
           volumeMounts:
             - name: manifest
@@ -37,7 +37,7 @@ data:
       name: spark-ad
     spec:
       sparkImage:
-        productVersion: 3.5.0
+        productVersion: 3.5.1
       mode: cluster
       mainApplicationFile: local:///spark-scripts/spark-ad.py
       deps:
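
The second hunk bumps productVersion inside a SparkApplication that this job ships as an embedded manifest. A minimal sketch of where the field sits (the apiVersion is an assumption based on the Stackable spark-k8s operator's usual CRD group; unrelated fields are elided):

  apiVersion: spark.stackable.tech/v1alpha1
  kind: SparkApplication
  metadata:
    name: spark-ad
  spec:
    sparkImage:
      productVersion: 3.5.1  # bumped from 3.5.0 in this commit
    mode: cluster
    mainApplicationFile: local:///spark-scripts/spark-ad.py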

demos/spark-k8s-anomaly-detection-taxi-data/setup-superset.yaml

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ spec:
     spec:
      containers:
         - name: setup-superset
-          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
           command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/spark-k8s-anomaly-detection-taxi-data/superset-assets.zip && python -u /tmp/script/script.py"]
           volumeMounts:
             - name: script

demos/trino-taxi-data/create-table-in-trino.yaml

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ spec:
     spec:
       containers:
         - name: create-ny-taxi-data-table-in-trino
-          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
           command: ["bash", "-c", "python -u /tmp/script/script.py"]
           volumeMounts:
             - name: script

demos/trino-taxi-data/setup-superset.yaml

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ spec:
     spec:
       containers:
         - name: setup-superset
-          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.3.0
+          image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
           command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/trino-taxi-data/superset-assets.zip && python -u /tmp/script/script.py"]
           volumeMounts:
             - name: script

docs/modules/demos/pages/data-lakehouse-iceberg-trino-spark.adoc

Lines changed: 7 additions & 7 deletions

@@ -34,7 +34,7 @@ $ stackablectl demo install data-lakehouse-iceberg-trino-spark
 [#system-requirements]
 == System requirements

-The demo was developed and tested on a kubernetes cluster with 10 nodes (4 cores (8 threads), 20GiB RAM and 30GB HDD).
+The demo was developed and tested on a kubernetes cluster with about 12 nodes (4 cores with hyperthreading/SMT, 20GiB RAM and 30GB HDD).
 Instance types that loosely correspond to this on the Hyperscalers are:

 - *Google*: `e2-standard-8`
@@ -179,7 +179,7 @@ As you can see, the table `house-sales` is partitioned by year. Go ahead and cli

 image::data-lakehouse-iceberg-trino-spark/minio_5.png[]

-You can see that Trino has placed a single file into the selected folder containing all the house sales of that
+You can see that Trino has placed data into the selected folder containing all the house sales of that
 particular year.

 == NiFi
@@ -491,7 +491,7 @@ Here you can see all the available Trino catalogs.
 * `tpcds`: https://trino.io/docs/current/connector/tpcds.html[TPCDS connector] providing a set of schemas to support the
   http://www.tpc.org/tpcds/[TPC Benchmark™ DS]
 * `tpch`: https://trino.io/docs/current/connector/tpch.html[TPCH connector] providing a set of schemas to support the
-  http://www.tpc.org/tpcds/[TPC Benchmark™ DS]
+  http://www.tpc.org/tpch/[TPC Benchmark™ H]
 * `lakehouse`: The lakehouse area containing the enriched and performant accessible data

 == Superset
@@ -541,14 +541,14 @@ image::data-lakehouse-iceberg-trino-spark/superset_7.png[]
 On the left, select the database `Trino lakehouse`, the schema `house_sales`, and set `See table schema` to
 `house_sales`.

-image::data-lakehouse-iceberg-trino-spark/superset_8.png[]
-
-[NOTE]
+[IMPORTANT]
 ====
-This older screenshot shows how the table preview would look like. Currently, there is an https://github.com/apache/superset/issues/25307[open issue]
+The older screenshot below shows how the table preview looks. Currently, there is an https://github.com/apache/superset/issues/25307[open issue]
 with previewing trino tables using the Iceberg connector. This doesn't affect the following execution of the SQL statement.
 ====

+image::data-lakehouse-iceberg-trino-spark/superset_8.png[]
+
 In the right textbox, you can enter the desired SQL statement. If you want to avoid making one up, use the following:

 [source,sql]

docs/modules/demos/pages/logging.adoc

Lines changed: 2 additions & 2 deletions

@@ -46,10 +46,10 @@ This demo will
    synchronization, and providing group services. This demo makes its log data observable in OpenSearch Dashboards.
 ** *Vector*: A tool for building observability pipelines. This demo uses Vector as a log agent to gather and transform
    the logs and as an aggregator to forward the collected logs to OpenSearch.
-** *OpenSearch*: A data store and search engine. This demo uses it to store and index the of the log data.
+** *OpenSearch*: A data store and search engine. This demo uses it to store and index the log data.
 ** *OpenSearch Dashboards*: A visualization and user interface. This demo uses it to make the log data easily accessible
    to the user.
-* Create a view in OpenSearch Dashboards for convenient browsing the log data.
+* Create a view in OpenSearch Dashboards to conveniently browse the log data.

 You can see the deployed products and their relationship in the following diagram:

docs/modules/demos/pages/spark-k8s-anomaly-detection-taxi-data.adoc

Lines changed: 1 addition & 1 deletion

@@ -23,7 +23,7 @@ This demo should not be run alongside other demos.

 To run this demo, your system needs at least:

-* 8 {k8s-cpu}[cpu units] (core/hyperthread)
+* 10 {k8s-cpu}[cpu units] (core/hyperthread)
 * 32GiB memory
 * 35GiB disk storage

stacks/_templates/jupyterhub.yaml

Lines changed: 2 additions & 2 deletions

@@ -3,7 +3,7 @@ name: jupyterhub
 repo:
   name: jupyterhub
   url: https://jupyterhub.github.io/helm-chart/
-version: 3.2.1
+version: 3.3.7
 options:
   hub:
     config:
@@ -49,7 +49,7 @@ options:
       HADOOP_CONF_DIR: "/home/jovyan/hdfs"
     initContainers:
       - name: download-notebook
-        image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.3.0
+        image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
         command: ['sh', '-c', 'curl https://raw.githubusercontent.com/stackabletech/demos/main/stacks/jupyterhub-pyspark-hdfs/notebook.ipynb -o /notebook/notebook.ipynb']
         volumeMounts:
           - mountPath: /notebook
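
The template pins both the chart repo and the chart version. The rough Helm equivalent of what the stack resolves to would be something like the following (release name and values file are placeholders):

  helm repo add jupyterhub https://jupyterhub.github.io/helm-chart/
  helm install jupyterhub jupyterhub/jupyterhub --version 3.3.7 --values values.yaml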

stacks/_templates/keycloak.yaml

Lines changed: 1 addition & 1 deletion

@@ -48,7 +48,7 @@ spec:
         - name: tls
           mountPath: /tls/
       - name: create-auth-class
-        image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable0.0.0-dev # We need 0.0.0-dev, so we get kubectl
+        image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
         command: ["/bin/bash", "-c"]
         args:
           - |
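
The dropped comment explained that the 0.0.0-dev tag was only needed to get kubectl into the image; pinning 24.7.0 implies the released testing-tools image now ships it. A quick hypothetical check (the one-off pod name is arbitrary):

  kubectl run kubectl-check --rm -it --restart=Never \
    --image=docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0 \
    -- kubectl version --client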

stacks/_templates/minio-distributed-small.yaml

Lines changed: 1 addition & 1 deletion

@@ -3,7 +3,7 @@ name: minio
 repo:
   name: minio
   url: https://charts.min.io/
-version: 5.0.14
+version: 5.2.0
 options:
   additionalLabels:
     stackable.tech/vendor: Stackable

stacks/_templates/minio-distributed.yaml

Lines changed: 1 addition & 1 deletion

@@ -3,7 +3,7 @@ name: minio
 repo:
   name: minio
   url: https://charts.min.io/
-version: 5.0.14
+version: 5.2.0
 options:
   additionalLabels:
     stackable.tech/vendor: Stackable

stacks/_templates/minio.yaml

Lines changed: 1 addition & 1 deletion

@@ -3,7 +3,7 @@ name: minio
 repo:
   name: minio
   url: https://charts.min.io/
-version: 5.0.14
+version: 5.2.0
 options:
   additionalLabels:
     stackable.tech/vendor: Stackable
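
All three MinIO templates move from chart 5.0.14 to 5.2.0. Before a bump like this, the available versions can be checked against the pinned repo with standard Helm commands:

  helm repo add minio https://charts.min.io/
  helm repo update
  helm search repo minio/minio --versions | head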
