Update refs script #132
Merged
Conversation
…replaceable on release
stackableRelease updates in stacks-v2.yaml
diff --git a/stacks/stacks-v2.yaml b/stacks/stacks-v2.yaml
index 12a55e9..6a5d10e 100644
--- a/stacks/stacks-v2.yaml
+++ b/stacks/stacks-v2.yaml
@@ -2,7 +2,7 @@
stacks:
monitoring:
description: Stack containing Prometheus and Grafana
- stackableRelease: 24.7
+ stackableRelease: 1.23
stackableOperators:
- commons
- listener
@@ -25,7 +25,7 @@ stacks:
default: adminadmin
logging:
description: Stack containing OpenSearch, OpenSearch Dashboards (Kibana) and Vector aggregator
- stackableRelease: 24.7
+ stackableRelease: 1.23
stackableOperators:
- commons
- listener
@@ -60,7 +60,7 @@ stacks:
observability:
description: >-
An observability stack with auto-injection of the opentelemetry-collector sidecar to receive traces/logs/metrics via OTLP, and send them to Jaeger/Tempo/Loki.
- stackableRelease: dev
+ stackableRelease: 1.23
stackableOperators:
- commons
- listener
@@ -85,7 +85,7 @@ stacks:
default: adminadmin
airflow:
description: Stack containing Airflow scheduling platform
- stackableRelease: 24.7
+ stackableRelease: 1.23
stackableOperators:
- commons
- listener
@@ -112,7 +112,7 @@ stacks:
default: airflowSecretKey
data-lakehouse-iceberg-trino-spark:
description: Data lakehouse using Iceberg lakehouse on S3, Trino as query engine, Spark for streaming ingest and Superset for data visualization
- stackableRelease: 24.7
+ stackableRelease: 1.23
stackableOperators:
- commons
- listener
@@ -169,7 +169,7 @@ stacks:
default: supersetSecretKey
hdfs-hbase:
description: HBase cluster using HDFS as underlying storage
- stackableRelease: 24.7
+ stackableRelease: 1.23
stackableOperators:
- commons
- listener
@@ -192,7 +192,7 @@ stacks:
parameters: []
nifi-kafka-druid-superset-s3:
description: Stack containing NiFi, Kafka, Druid, MinIO and Superset for data visualization
- stackableRelease: 24.7
+ stackableRelease: 1.23
stackableOperators:
- commons
- listener
@@ -238,7 +238,7 @@ stacks:
default: adminadmin
spark-trino-superset-s3:
description: Stack containing MinIO, Trino and Superset for data visualization
- stackableRelease: 24.7
+ stackableRelease: 1.23
stackableOperators:
- commons
- listener
@@ -283,7 +283,7 @@ stacks:
default: supersetSecretKey
trino-superset-s3:
description: Stack containing MinIO, Trino and Superset for data visualization
- stackableRelease: 24.7
+ stackableRelease: 1.23
stackableOperators:
- commons
- listener
@@ -325,7 +325,7 @@ stacks:
default: supersetSecretKey
trino-iceberg:
description: Stack containing Trino using Apache Iceberg as a S3 data lakehouse
- stackableRelease: 24.7
+ stackableRelease: 1.23
stackableOperators:
- commons
- listener
@@ -359,7 +359,7 @@ stacks:
default: adminadmin
jupyterhub-pyspark-hdfs:
description: Jupyterhub with PySpark and HDFS integration
- stackableRelease: 24.7
+ stackableRelease: 1.23
stackableOperators:
- commons
- listener
@@ -389,7 +389,7 @@ stacks:
default: adminadmin
dual-hive-hdfs-s3:
description: Dual stack Hive on HDFS and S3 for Hadoop/Hive to Trino migration
- stackableRelease: 24.7
+ stackableRelease: 1.23
stackableOperators:
- commons
- listener
@@ -426,7 +426,7 @@ stacks:
The bind user credentials are: ldapadmin:ldapadminpassword.
No AuthenticationClass is configured, The AuthenticationClass is created manually in the tutorial.
Use the 'openldap' Stack for an OpenLDAD with an AuthenticationClass already installed.
- stackableRelease: 24.7
+ stackableRelease: 1.23
stackableOperators:
- commons
- listener
@@ -449,7 +449,7 @@ stacks:
The bind user credentials are: ldapadmin:ldapadminpassword.
The LDAP AuthenticationClass is called 'ldap' and the SecretClass for the bind credentials is called 'ldap-bind-credentials'.
The stack already creates an appropriate Secret, so referring to the 'ldap' AuthenticationClass in your ProductCluster should be enough.
- stackableRelease: 24.7
+ stackableRelease: 1.23
stackableOperators:
- commons
- listener
@@ -475,7 +475,7 @@ stacks:
3 users are created in Keycloak: admin:adminadmin, alice:alicealice, bob:bobbob. admin and alice are admins with
full authorization in Druid and Trino, bob is not authorized.
This is a proof-of-concept and the mechanisms used here are subject to change.
- stackableRelease: 24.7
+ stackableRelease: 1.23
stackableOperators:
- commons
- listener
@@ -541,7 +541,7 @@ stacks:
Note that this stack is tightly coupled with the demo.
So if you install the stack you will get demo-specific parts (such as Keycloak users or regorules).
- stackableRelease: 24.7
+ stackableRelease: 1.23
stackableOperators:
- commons
- listener
@@ -611,7 +611,7 @@ stacks:
signal-processing:
description: >-
A stack used for creating, streaming and processing in-flight data and persisting it to TimescaleDB before it is displayed in Grafana
- stackableRelease: 24.7
+ stackableRelease: 1.23
stackableOperators:
- commons
- listener
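The refs script itself is not part of this diff. Purely as an illustrative sketch of what this commit's bulk edit amounts to (assuming GNU sed; the RELEASE variable and in-place edit are assumptions, not the project's actual tooling):

RELEASE="1.23"
# Pin every stackableRelease entry in the stacks definition to the new release.
sed -i -E "s/^([[:space:]]*stackableRelease:).*/\1 ${RELEASE}/" stacks/stacks-v2.yaml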
update image refs
diff --git a/demos/airflow-scheduled-job/03-enable-and-run-spark-dag.yaml b/demos/airflow-scheduled-job/03-enable-and-run-spark-dag.yaml
index dd65085..6444ee4 100644
--- a/demos/airflow-scheduled-job/03-enable-and-run-spark-dag.yaml
+++ b/demos/airflow-scheduled-job/03-enable-and-run-spark-dag.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: start-pyspark-job
- image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/tools:1.0.0-stackable1.23.0
# N.B. it is possible for the scheduler to report that a DAG exists, only for the worker task to fail if a pod is unexpectedly
# restarted. Additionally, the db-init job takes a few minutes to complete before the cluster is deployed. The wait/watch steps
# below are not "water-tight" but add a layer of stability by at least ensuring that the db is initialized and ready and that
diff --git a/demos/airflow-scheduled-job/04-enable-and-run-date-dag.yaml b/demos/airflow-scheduled-job/04-enable-and-run-date-dag.yaml
index b5e9ba8..472a93d 100644
--- a/demos/airflow-scheduled-job/04-enable-and-run-date-dag.yaml
+++ b/demos/airflow-scheduled-job/04-enable-and-run-date-dag.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: start-date-job
- image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/tools:1.0.0-stackable1.23.0
# N.B. it is possible for the scheduler to report that a DAG exists, only for the worker task to fail if a pod is unexpectedly
# restarted. Additionally, the db-init job takes a few minutes to complete before the cluster is deployed. The wait/watch steps
# below are not "water-tight" but add a layer of stability by at least ensuring that the db is initialized and ready and that
diff --git a/demos/data-lakehouse-iceberg-trino-spark/create-nifi-ingestion-job.yaml b/demos/data-lakehouse-iceberg-trino-spark/create-nifi-ingestion-job.yaml
index 277c600..840193a 100644
--- a/demos/data-lakehouse-iceberg-trino-spark/create-nifi-ingestion-job.yaml
+++ b/demos/data-lakehouse-iceberg-trino-spark/create-nifi-ingestion-job.yaml
@@ -9,11 +9,11 @@ spec:
serviceAccountName: demo-serviceaccount
initContainers:
- name: wait-for-kafka
- image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/tools:1.0.0-stackable1.23.0
command: ["bash", "-c", "echo 'Waiting for all kafka brokers to be ready' && kubectl wait --for=condition=ready --timeout=30m pod -l app.kubernetes.io/instance=kafka -l app.kubernetes.io/name=kafka"]
containers:
- name: create-nifi-ingestion-job
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/data-lakehouse-iceberg-trino-spark/LakehouseKafkaIngest.xml && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
diff --git a/demos/data-lakehouse-iceberg-trino-spark/create-spark-ingestion-job.yaml b/demos/data-lakehouse-iceberg-trino-spark/create-spark-ingestion-job.yaml
index 423f0fa..d8a65fe 100644
--- a/demos/data-lakehouse-iceberg-trino-spark/create-spark-ingestion-job.yaml
+++ b/demos/data-lakehouse-iceberg-trino-spark/create-spark-ingestion-job.yaml
@@ -12,11 +12,11 @@ spec:
serviceAccountName: demo-serviceaccount
initContainers:
- name: wait-for-kafka
- image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/tools:1.0.0-stackable1.23.0
command: ["bash", "-c", "echo 'Waiting for all kafka brokers to be ready' && kubectl wait --for=condition=ready --timeout=30m pod -l app.kubernetes.io/name=kafka -l app.kubernetes.io/instance=kafka"]
containers:
- name: create-spark-ingestion-job
- image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/tools:1.0.0-stackable1.23.0
command: ["bash", "-c", "echo 'Submitting Spark job' && kubectl apply -f /tmp/manifest/spark-ingestion-job.yaml"]
volumeMounts:
- name: manifest
diff --git a/demos/data-lakehouse-iceberg-trino-spark/create-trino-tables.yaml b/demos/data-lakehouse-iceberg-trino-spark/create-trino-tables.yaml
index 8cb6e3d..a83a2d9 100644
--- a/demos/data-lakehouse-iceberg-trino-spark/create-trino-tables.yaml
+++ b/demos/data-lakehouse-iceberg-trino-spark/create-trino-tables.yaml
@@ -9,11 +9,11 @@ spec:
serviceAccountName: demo-serviceaccount
initContainers:
- name: wait-for-testdata
- image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/tools:1.0.0-stackable1.23.0
command: ["bash", "-c", "echo 'Waiting for job load-test-data to finish' && kubectl wait --for=condition=complete --timeout=30m job/load-test-data"]
containers:
- name: create-tables-in-trino
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["bash", "-c", "python -u /tmp/script/script.py"]
volumeMounts:
- name: script
diff --git a/demos/data-lakehouse-iceberg-trino-spark/setup-superset.yaml b/demos/data-lakehouse-iceberg-trino-spark/setup-superset.yaml
index d5fdff6..b31fcbc 100644
--- a/demos/data-lakehouse-iceberg-trino-spark/setup-superset.yaml
+++ b/demos/data-lakehouse-iceberg-trino-spark/setup-superset.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: setup-superset
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/data-lakehouse-iceberg-trino-spark/superset-assets.zip && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
diff --git a/demos/end-to-end-security/create-spark-report.yaml b/demos/end-to-end-security/create-spark-report.yaml
index c72845e..696e1e0 100644
--- a/demos/end-to-end-security/create-spark-report.yaml
+++ b/demos/end-to-end-security/create-spark-report.yaml
@@ -12,7 +12,7 @@ spec:
serviceAccountName: demo-serviceaccount
initContainers:
- name: wait-for-trino-tables
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command:
- bash
- -euo
@@ -23,7 +23,7 @@ spec:
kubectl wait --timeout=30m --for=condition=complete job/create-tables-in-trino
containers:
- name: create-spark-report
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command:
- bash
- -euo
diff --git a/demos/end-to-end-security/create-trino-tables.yaml b/demos/end-to-end-security/create-trino-tables.yaml
index 7c488d5..7655b01 100644
--- a/demos/end-to-end-security/create-trino-tables.yaml
+++ b/demos/end-to-end-security/create-trino-tables.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: create-tables-in-trino
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["bash", "-c", "python -u /tmp/script/script.py"]
volumeMounts:
- name: script
diff --git a/demos/hbase-hdfs-load-cycling-data/create-hfile-and-import-to-hbase.yaml b/demos/hbase-hdfs-load-cycling-data/create-hfile-and-import-to-hbase.yaml
index 7c561ed..6999423 100644
--- a/demos/hbase-hdfs-load-cycling-data/create-hfile-and-import-to-hbase.yaml
+++ b/demos/hbase-hdfs-load-cycling-data/create-hfile-and-import-to-hbase.yaml
@@ -9,7 +9,7 @@ spec:
spec:
containers:
- name: create-hfile-and-import-to-hbase
- image: docker.stackable.tech/stackable/hbase:2.4.18-stackable24.7.0
+ image: docker.stackable.tech/stackable/hbase:2.4.18-stackable1.23.0
env:
- name: HADOOP_USER_NAME
value: stackable
diff --git a/demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data/load-test-data.yaml b/demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data/load-test-data.yaml
index d02c508..6a5151d 100644
--- a/demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data/load-test-data.yaml
+++ b/demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data/load-test-data.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: load-ny-taxi-data
- image: docker.stackable.tech/stackable/hadoop:3.4.0-stackable0.0.0-dev
+ image: docker.stackable.tech/stackable/hadoop:3.4.0-stackable1.23.0
# yamllint disable rule:line-length
command: ["bash", "-c", "/stackable/hadoop/bin/hdfs dfs -mkdir -p /ny-taxi-data/raw \
&& cd /tmp \
diff --git a/demos/nifi-kafka-druid-earthquake-data/create-druid-ingestion-job.yaml b/demos/nifi-kafka-druid-earthquake-data/create-druid-ingestion-job.yaml
index 3416ed9..91977cb 100644
--- a/demos/nifi-kafka-druid-earthquake-data/create-druid-ingestion-job.yaml
+++ b/demos/nifi-kafka-druid-earthquake-data/create-druid-ingestion-job.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: create-druid-ingestion-job
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["bash", "-c", "curl -X POST --insecure -H 'Content-Type: application/json' -d @/tmp/ingestion-job-spec/ingestion-job-spec.json https://druid-coordinator:8281/druid/indexer/v1/supervisor"]
volumeMounts:
- name: ingestion-job-spec
diff --git a/demos/nifi-kafka-druid-earthquake-data/create-nifi-ingestion-job.yaml b/demos/nifi-kafka-druid-earthquake-data/create-nifi-ingestion-job.yaml
index 231d881..be82a6c 100644
--- a/demos/nifi-kafka-druid-earthquake-data/create-nifi-ingestion-job.yaml
+++ b/demos/nifi-kafka-druid-earthquake-data/create-nifi-ingestion-job.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: create-nifi-ingestion-job
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-earthquake-data/IngestEarthquakesToKafka.xml && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
diff --git a/demos/nifi-kafka-druid-earthquake-data/setup-superset.yaml b/demos/nifi-kafka-druid-earthquake-data/setup-superset.yaml
index b52a2ad..75a3155 100644
--- a/demos/nifi-kafka-druid-earthquake-data/setup-superset.yaml
+++ b/demos/nifi-kafka-druid-earthquake-data/setup-superset.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: setup-superset
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-earthquake-data/superset-assets.zip && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
diff --git a/demos/nifi-kafka-druid-water-level-data/create-druid-ingestion-job.yaml b/demos/nifi-kafka-druid-water-level-data/create-druid-ingestion-job.yaml
index 3c2d620..1455071 100644
--- a/demos/nifi-kafka-druid-water-level-data/create-druid-ingestion-job.yaml
+++ b/demos/nifi-kafka-druid-water-level-data/create-druid-ingestion-job.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: create-druid-ingestion-job
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["bash", "-c", "curl -X POST --insecure -H 'Content-Type: application/json' -d @/tmp/ingestion-job-spec/stations-ingestion-job-spec.json https://druid-coordinator:8281/druid/indexer/v1/supervisor && curl -X POST --insecure -H 'Content-Type: application/json' -d @/tmp/ingestion-job-spec/measurements-ingestion-job-spec.json https://druid-coordinator:8281/druid/indexer/v1/supervisor && curl -X POST --insecure -H 'Content-Type: application/json' -d @/tmp/ingestion-job-spec/measurements-compaction-job-spec.json https://druid-coordinator:8281/druid/coordinator/v1/config/compaction"]
volumeMounts:
- name: ingestion-job-spec
diff --git a/demos/nifi-kafka-druid-water-level-data/create-nifi-ingestion-job.yaml b/demos/nifi-kafka-druid-water-level-data/create-nifi-ingestion-job.yaml
index 6795a68..db32dd3 100644
--- a/demos/nifi-kafka-druid-water-level-data/create-nifi-ingestion-job.yaml
+++ b/demos/nifi-kafka-druid-water-level-data/create-nifi-ingestion-job.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: create-nifi-ingestion-job
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-water-level-data/IngestWaterLevelsToKafka.xml && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
diff --git a/demos/nifi-kafka-druid-water-level-data/setup-superset.yaml b/demos/nifi-kafka-druid-water-level-data/setup-superset.yaml
index 6cf44c5..d939f0f 100644
--- a/demos/nifi-kafka-druid-water-level-data/setup-superset.yaml
+++ b/demos/nifi-kafka-druid-water-level-data/setup-superset.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: setup-superset
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-water-level-data/superset-assets.zip && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
diff --git a/demos/signal-processing/Dockerfile-nifi b/demos/signal-processing/Dockerfile-nifi
index db643c3..f2ce34d 100644
--- a/demos/signal-processing/Dockerfile-nifi
+++ b/demos/signal-processing/Dockerfile-nifi
@@ -1,3 +1,3 @@
-FROM docker.stackable.tech/stackable/nifi:1.27.0-stackable24.7.0
+FROM docker.stackable.tech/stackable/nifi:1.27.0-stackable1.23.0
RUN curl --fail -o /stackable/nifi/postgresql-42.6.0.jar "https://repo.stackable.tech/repository/misc/postgresql-timescaledb/postgresql-42.6.0.jar"
diff --git a/demos/signal-processing/create-nifi-ingestion-job.yaml b/demos/signal-processing/create-nifi-ingestion-job.yaml
index 51179a5..ae57ef3 100644
--- a/demos/signal-processing/create-nifi-ingestion-job.yaml
+++ b/demos/signal-processing/create-nifi-ingestion-job.yaml
@@ -9,13 +9,13 @@ spec:
serviceAccountName: demo-serviceaccount
initContainers:
- name: wait-for-timescale-job
- image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/tools:1.0.0-stackable1.23.0
command: ["bash", "-c", "echo 'Waiting for timescaleDB tables to be ready'
&& kubectl wait --for=condition=complete job/create-timescale-tables-job"
]
containers:
- name: create-nifi-ingestion-job
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["bash", "-c", "export PGPASSWORD=$(cat /timescale-admin-credentials/password) && \
curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/signal-processing/DownloadAndWriteToDB.xml && \
sed -i \"s/PLACEHOLDERPGPASSWORD/$PGPASSWORD/g\" DownloadAndWriteToDB.xml && \
diff --git a/demos/signal-processing/create-timescale-tables.yaml b/demos/signal-processing/create-timescale-tables.yaml
index 61089f3..417a5ec 100644
--- a/demos/signal-processing/create-timescale-tables.yaml
+++ b/demos/signal-processing/create-timescale-tables.yaml
@@ -9,7 +9,7 @@ spec:
serviceAccountName: demo-serviceaccount
initContainers:
- name: wait-for-timescale
- image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/tools:1.0.0-stackable1.23.0
command: ["bash", "-c", "echo 'Waiting for timescaleDB to be ready'
&& kubectl wait --for=condition=ready --timeout=30m pod -l app.kubernetes.io/name=postgresql-timescaledb"
]
diff --git a/demos/spark-k8s-anomaly-detection-taxi-data/create-spark-anomaly-detection-job.yaml b/demos/spark-k8s-anomaly-detection-taxi-data/create-spark-anomaly-detection-job.yaml
index 5dce76c..a80af3a 100644
--- a/demos/spark-k8s-anomaly-detection-taxi-data/create-spark-anomaly-detection-job.yaml
+++ b/demos/spark-k8s-anomaly-detection-taxi-data/create-spark-anomaly-detection-job.yaml
@@ -8,11 +8,11 @@ spec:
spec:
initContainers:
- name: wait-for-testdata
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["bash", "-c", "echo 'Waiting for job load-ny-taxi-data to finish' && kubectl wait --for=condition=complete --timeout=30m job/load-ny-taxi-data"]
containers:
- name: create-spark-anomaly-detection-job
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["bash", "-c", "echo 'Submitting Spark job' && kubectl apply -f /tmp/manifest/spark-ad-job.yaml"]
volumeMounts:
- name: manifest
diff --git a/demos/spark-k8s-anomaly-detection-taxi-data/setup-superset.yaml b/demos/spark-k8s-anomaly-detection-taxi-data/setup-superset.yaml
index 36aba95..624efd1 100644
--- a/demos/spark-k8s-anomaly-detection-taxi-data/setup-superset.yaml
+++ b/demos/spark-k8s-anomaly-detection-taxi-data/setup-superset.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: setup-superset
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/spark-k8s-anomaly-detection-taxi-data/superset-assets.zip && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
diff --git a/demos/trino-taxi-data/create-table-in-trino.yaml b/demos/trino-taxi-data/create-table-in-trino.yaml
index d45ce7d..9c0f8c2 100644
--- a/demos/trino-taxi-data/create-table-in-trino.yaml
+++ b/demos/trino-taxi-data/create-table-in-trino.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: create-ny-taxi-data-table-in-trino
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["bash", "-c", "python -u /tmp/script/script.py"]
volumeMounts:
- name: script
diff --git a/demos/trino-taxi-data/setup-superset.yaml b/demos/trino-taxi-data/setup-superset.yaml
index 2c94efd..6a9a59b 100644
--- a/demos/trino-taxi-data/setup-superset.yaml
+++ b/demos/trino-taxi-data/setup-superset.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: setup-superset
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/trino-taxi-data/superset-assets.zip && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
diff --git a/stacks/_templates/jupyterhub.yaml b/stacks/_templates/jupyterhub.yaml
index fd4bbd8..a33fafc 100644
--- a/stacks/_templates/jupyterhub.yaml
+++ b/stacks/_templates/jupyterhub.yaml
@@ -50,7 +50,7 @@ options:
HADOOP_CONF_DIR: "/home/jovyan/hdfs"
initContainers:
- name: download-notebook
- image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/tools:1.0.0-stackable1.23.0
command: ['sh', '-c', 'curl https://raw.githubusercontent.com/stackabletech/demos/main/stacks/jupyterhub-pyspark-hdfs/notebook.ipynb -o /notebook/notebook.ipynb']
volumeMounts:
- mountPath: /notebook
diff --git a/stacks/_templates/keycloak.yaml b/stacks/_templates/keycloak.yaml
index ecc9a9f..c40b7be 100644
--- a/stacks/_templates/keycloak.yaml
+++ b/stacks/_templates/keycloak.yaml
@@ -48,7 +48,7 @@ spec:
- name: tls
mountPath: /tls/
- name: create-auth-class
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["/bin/bash", "-c"]
args:
- |
diff --git a/stacks/end-to-end-security/krb5.yaml b/stacks/end-to-end-security/krb5.yaml
index 1965723..2962f9d 100644
--- a/stacks/end-to-end-security/krb5.yaml
+++ b/stacks/end-to-end-security/krb5.yaml
@@ -14,7 +14,7 @@ spec:
spec:
initContainers:
- name: init
- image: docker.stackable.tech/stackable/krb5:1.21.1-stackable24.7.0
+ image: docker.stackable.tech/stackable/krb5:1.21.1-stackable1.23.0
args:
- sh
- -euo
@@ -35,7 +35,7 @@ spec:
name: data
containers:
- name: kdc
- image: docker.stackable.tech/stackable/krb5:1.21.1-stackable24.7.0
+ image: docker.stackable.tech/stackable/krb5:1.21.1-stackable1.23.0
args:
- krb5kdc
- -n
@@ -48,7 +48,7 @@ spec:
- mountPath: /var/kerberos/krb5kdc
name: data
- name: kadmind
- image: docker.stackable.tech/stackable/krb5:1.21.1-stackable24.7.0
+ image: docker.stackable.tech/stackable/krb5:1.21.1-stackable1.23.0
args:
- kadmind
- -nofork
@@ -61,7 +61,7 @@ spec:
- mountPath: /var/kerberos/krb5kdc
name: data
- name: client
- image: docker.stackable.tech/stackable/krb5:1.21.1-stackable24.7.0
+ image: docker.stackable.tech/stackable/krb5:1.21.1-stackable1.23.0
tty: true
stdin: true
env:
diff --git a/stacks/end-to-end-security/superset.yaml b/stacks/end-to-end-security/superset.yaml
index 0577245..b68da47 100644
--- a/stacks/end-to-end-security/superset.yaml
+++ b/stacks/end-to-end-security/superset.yaml
@@ -25,7 +25,7 @@ spec:
initContainers:
# The postgres image does not contain curl or wget...
- name: download-dump
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command:
- bash
- -c
diff --git a/stacks/keycloak-opa-poc/keycloak.yaml b/stacks/keycloak-opa-poc/keycloak.yaml
index a6c2e22..3e879a2 100644
--- a/stacks/keycloak-opa-poc/keycloak.yaml
+++ b/stacks/keycloak-opa-poc/keycloak.yaml
@@ -70,7 +70,7 @@ spec:
spec:
containers:
- name: propagate-keycloak-address
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command:
- bash
- -x
diff --git a/stacks/keycloak-opa-poc/setup-keycloak.yaml b/stacks/keycloak-opa-poc/setup-keycloak.yaml
index f21d64a..0c1f634 100644
--- a/stacks/keycloak-opa-poc/setup-keycloak.yaml
+++ b/stacks/keycloak-opa-poc/setup-keycloak.yaml
@@ -29,7 +29,7 @@ spec:
spec:
containers:
- name: setup-keycloak
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
env:
- name: KEYCLOAK_ADMIN_PASSWORD
valueFrom:
diff --git a/stacks/logging/setup-opensearch-dashboards.yaml b/stacks/logging/setup-opensearch-dashboards.yaml
index c3b4330..f07591b 100644
--- a/stacks/logging/setup-opensearch-dashboards.yaml
+++ b/stacks/logging/setup-opensearch-dashboards.yaml
@@ -8,7 +8,7 @@ spec:
spec:
containers:
- name: setup-opensearch-dashboards
- image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
env:
- name: OPEN_SEARCH_ADMIN_PASSWORD
valueFrom:
diff --git a/stacks/signal-processing/jupyterhub.yaml b/stacks/signal-processing/jupyterhub.yaml
index f26e598..e73ae48 100644
--- a/stacks/signal-processing/jupyterhub.yaml
+++ b/stacks/signal-processing/jupyterhub.yaml
@@ -30,7 +30,7 @@ options:
singleuser:
cmd: null
image:
- # TODO (@NickLarsenNZ): Use a versioned image with stackable0.0.0-dev or stackableXX.X.X so that
+ # TODO (@NickLarsenNZ): Use a versioned image with stackable1.23.0 or stackableXX.X.X so that
# the demo is reproducable for the release and it will be automatically replaced for the release branch.
name: docker.stackable.tech/demos/jupyter-pyspark-with-alibi-detect
tag: python-3.9
@@ -41,7 +41,7 @@ options:
stackable.tech/vendor: Stackable
initContainers:
- name: download-notebook
- image: docker.stackable.tech/stackable/tools:1.0.0-stackable24.7.0
+ image: docker.stackable.tech/stackable/tools:1.0.0-stackable1.23.0
command: ['sh', '-c', 'curl https://raw.githubusercontent.com/stackabletech/demos/main/stacks/signal-processing/tsdb.ipynb -o /notebook/tsdb.ipynb']
volumeMounts:
- mountPath: /notebook
diff --git a/stacks/signal-processing/nifi.yaml b/stacks/signal-processing/nifi.yaml
index 48d7c39..7e06213 100644
--- a/stacks/signal-processing/nifi.yaml
+++ b/stacks/signal-processing/nifi.yaml
@@ -6,7 +6,7 @@ metadata:
spec:
image:
productVersion: 1.27.0
- # TODO (@NickLarsenNZ): Use a versioned image with stackable0.0.0-dev or stackableXX.X.X so that
+ # TODO (@NickLarsenNZ): Use a versioned image with stackable1.23.0 or stackableXX.X.X so that
# the demo is reproducable for the release and it will be automatically replaced for the release branch.
custom: docker.stackable.tech/demos/nifi:1.27.0-postgresql
clusterConfig:
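Again as a hypothetical sketch only (not the repository's actual refs script), the image-tag rewrite shown in this commit could be expressed as a GNU sed pass over the demo and stack manifests; the TAG variable and file globs are assumptions:

TAG="1.23.0"
# Rewrite tags like ...stackable24.7.0 or ...stackable0.0.0-dev to the pinned release tag.
find demos stacks \( -name '*.yaml' -o -name 'Dockerfile*' \) -print0 \
  | xargs -0 sed -i -E "s/stackable(0\.0\.0-dev|[0-9]+\.[0-9]+\.[0-9]+)/stackable${TAG}/g"

A pattern anchored on digits like this leaves the stackableXX.X.X placeholders in the TODO comments untouched, consistent with the diff above.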
update githubusercontent branch refs
diff --git a/demos/data-lakehouse-iceberg-trino-spark/create-nifi-ingestion-job.yaml b/demos/data-lakehouse-iceberg-trino-spark/create-nifi-ingestion-job.yaml
index 840193a..4059b4d 100644
--- a/demos/data-lakehouse-iceberg-trino-spark/create-nifi-ingestion-job.yaml
+++ b/demos/data-lakehouse-iceberg-trino-spark/create-nifi-ingestion-job.yaml
@@ -14,7 +14,7 @@ spec:
containers:
- name: create-nifi-ingestion-job
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
- command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/data-lakehouse-iceberg-trino-spark/LakehouseKafkaIngest.xml && python -u /tmp/script/script.py"]
+ command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/data-lakehouse-iceberg-trino-spark/LakehouseKafkaIngest.xml && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
mountPath: /tmp/script
diff --git a/demos/data-lakehouse-iceberg-trino-spark/setup-superset.yaml b/demos/data-lakehouse-iceberg-trino-spark/setup-superset.yaml
index b31fcbc..4e0e815 100644
--- a/demos/data-lakehouse-iceberg-trino-spark/setup-superset.yaml
+++ b/demos/data-lakehouse-iceberg-trino-spark/setup-superset.yaml
@@ -9,7 +9,7 @@ spec:
containers:
- name: setup-superset
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
- command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/data-lakehouse-iceberg-trino-spark/superset-assets.zip && python -u /tmp/script/script.py"]
+ command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/data-lakehouse-iceberg-trino-spark/superset-assets.zip && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
mountPath: /tmp/script
diff --git a/demos/demos-v2.yaml b/demos/demos-v2.yaml
index a31ff1e..7481831 100644
--- a/demos/demos-v2.yaml
+++ b/demos/demos-v2.yaml
@@ -7,10 +7,10 @@ demos:
- airflow
- job-scheduling
manifests:
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/airflow-scheduled-job/01-airflow-spark-clusterrole.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/airflow-scheduled-job/02-airflow-spark-clusterrolebinding.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/airflow-scheduled-job/03-enable-and-run-spark-dag.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/airflow-scheduled-job/04-enable-and-run-date-dag.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/airflow-scheduled-job/01-airflow-spark-clusterrole.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/airflow-scheduled-job/02-airflow-spark-clusterrolebinding.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/airflow-scheduled-job/03-enable-and-run-spark-dag.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/airflow-scheduled-job/04-enable-and-run-date-dag.yaml
supportedNamespaces: []
resourceRequests:
cpu: 2401m
@@ -24,8 +24,8 @@ demos:
- hdfs
- cycling-tripdata
manifests:
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/hbase-hdfs-load-cycling-data/distcp-cycling-data.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/hbase-hdfs-load-cycling-data/create-hfile-and-import-to-hbase.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/hbase-hdfs-load-cycling-data/distcp-cycling-data.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/hbase-hdfs-load-cycling-data/create-hfile-and-import-to-hbase.yaml
supportedNamespaces: []
resourceRequests:
cpu: "3"
@@ -43,9 +43,9 @@ demos:
- opa
- keycloak
manifests:
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/end-to-end-security/create-trino-tables.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/end-to-end-security/serviceaccount.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/end-to-end-security/create-spark-report.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/end-to-end-security/create-trino-tables.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/end-to-end-security/serviceaccount.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/end-to-end-security/create-spark-report.yaml
supportedNamespaces: []
resourceRequests:
cpu: 9000m
@@ -64,9 +64,9 @@ demos:
- s3
- earthquakes
manifests:
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-earthquake-data/create-nifi-ingestion-job.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-earthquake-data/create-druid-ingestion-job.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-earthquake-data/setup-superset.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/nifi-kafka-druid-earthquake-data/create-nifi-ingestion-job.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/nifi-kafka-druid-earthquake-data/create-druid-ingestion-job.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/nifi-kafka-druid-earthquake-data/setup-superset.yaml
supportedNamespaces: ["default"]
resourceRequests:
cpu: 8700m
@@ -85,9 +85,9 @@ demos:
- s3
- water-levels
manifests:
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-water-level-data/create-nifi-ingestion-job.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-water-level-data/create-druid-ingestion-job.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-water-level-data/setup-superset.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/nifi-kafka-druid-water-level-data/create-nifi-ingestion-job.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/nifi-kafka-druid-water-level-data/create-druid-ingestion-job.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/nifi-kafka-druid-water-level-data/setup-superset.yaml
supportedNamespaces: ["default"]
resourceRequests:
cpu: 8900m
@@ -104,10 +104,10 @@ demos:
- s3
- ny-taxi-data
manifests:
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/spark-k8s-anomaly-detection-taxi-data/serviceaccount.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/spark-k8s-anomaly-detection-taxi-data/load-test-data.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/spark-k8s-anomaly-detection-taxi-data/create-spark-anomaly-detection-job.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/spark-k8s-anomaly-detection-taxi-data/setup-superset.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/spark-k8s-anomaly-detection-taxi-data/serviceaccount.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/spark-k8s-anomaly-detection-taxi-data/load-test-data.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/spark-k8s-anomaly-detection-taxi-data/create-spark-anomaly-detection-job.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/spark-k8s-anomaly-detection-taxi-data/setup-superset.yaml
supportedNamespaces: []
resourceRequests:
cpu: 6400m
@@ -139,9 +139,9 @@ demos:
- s3
- ny-taxi-data
manifests:
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/trino-taxi-data/load-test-data.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/trino-taxi-data/create-table-in-trino.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/trino-taxi-data/setup-superset.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/trino-taxi-data/load-test-data.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/trino-taxi-data/create-table-in-trino.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/trino-taxi-data/setup-superset.yaml
supportedNamespaces: []
resourceRequests:
cpu: 6800m
@@ -164,12 +164,12 @@ demos:
- water-levels
- earthquakes
manifests:
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/data-lakehouse-iceberg-trino-spark/serviceaccount.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/data-lakehouse-iceberg-trino-spark/load-test-data.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/data-lakehouse-iceberg-trino-spark/create-trino-tables.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/data-lakehouse-iceberg-trino-spark/create-nifi-ingestion-job.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/data-lakehouse-iceberg-trino-spark/create-spark-ingestion-job.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/data-lakehouse-iceberg-trino-spark/setup-superset.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/data-lakehouse-iceberg-trino-spark/serviceaccount.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/data-lakehouse-iceberg-trino-spark/load-test-data.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/data-lakehouse-iceberg-trino-spark/create-trino-tables.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/data-lakehouse-iceberg-trino-spark/create-nifi-ingestion-job.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/data-lakehouse-iceberg-trino-spark/create-spark-ingestion-job.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/data-lakehouse-iceberg-trino-spark/setup-superset.yaml
supportedNamespaces: ["default"]
resourceRequests:
cpu: "80"
@@ -185,7 +185,7 @@ demos:
- pyspark
- ny-taxi-data
manifests:
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data/load-test-data.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data/load-test-data.yaml
supportedNamespaces: []
resourceRequests:
cpu: 3350m
@@ -202,7 +202,7 @@ demos:
- vector
- zookeeper
manifests:
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/logging/zookeeper.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/logging/zookeeper.yaml
supportedNamespaces: []
resourceRequests:
cpu: 6500m
@@ -218,9 +218,9 @@ demos:
- grafana-dashboards
- zookeeper
manifests:
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/signal-processing/serviceaccount.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/signal-processing/create-timescale-tables.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/demos/signal-processing/create-nifi-ingestion-job.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/signal-processing/serviceaccount.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/signal-processing/create-timescale-tables.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/signal-processing/create-nifi-ingestion-job.yaml
supportedNamespaces: []
resourceRequests:
cpu: "3"
diff --git a/demos/nifi-kafka-druid-earthquake-data/create-nifi-ingestion-job.yaml b/demos/nifi-kafka-druid-earthquake-data/create-nifi-ingestion-job.yaml
index be82a6c..ffc85d3 100644
--- a/demos/nifi-kafka-druid-earthquake-data/create-nifi-ingestion-job.yaml
+++ b/demos/nifi-kafka-druid-earthquake-data/create-nifi-ingestion-job.yaml
@@ -9,7 +9,7 @@ spec:
containers:
- name: create-nifi-ingestion-job
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
- command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-earthquake-data/IngestEarthquakesToKafka.xml && python -u /tmp/script/script.py"]
+ command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/nifi-kafka-druid-earthquake-data/IngestEarthquakesToKafka.xml && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
mountPath: /tmp/script
diff --git a/demos/nifi-kafka-druid-earthquake-data/setup-superset.yaml b/demos/nifi-kafka-druid-earthquake-data/setup-superset.yaml
index 75a3155..f0e4037 100644
--- a/demos/nifi-kafka-druid-earthquake-data/setup-superset.yaml
+++ b/demos/nifi-kafka-druid-earthquake-data/setup-superset.yaml
@@ -9,7 +9,7 @@ spec:
containers:
- name: setup-superset
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
- command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-earthquake-data/superset-assets.zip && python -u /tmp/script/script.py"]
+ command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/nifi-kafka-druid-earthquake-data/superset-assets.zip && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
mountPath: /tmp/script
diff --git a/demos/nifi-kafka-druid-water-level-data/create-nifi-ingestion-job.yaml b/demos/nifi-kafka-druid-water-level-data/create-nifi-ingestion-job.yaml
index db32dd3..80c0a5c 100644
--- a/demos/nifi-kafka-druid-water-level-data/create-nifi-ingestion-job.yaml
+++ b/demos/nifi-kafka-druid-water-level-data/create-nifi-ingestion-job.yaml
@@ -9,7 +9,7 @@ spec:
containers:
- name: create-nifi-ingestion-job
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
- command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-water-level-data/IngestWaterLevelsToKafka.xml && python -u /tmp/script/script.py"]
+ command: ["bash", "-c", "curl -O https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/nifi-kafka-druid-water-level-data/IngestWaterLevelsToKafka.xml && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
mountPath: /tmp/script
diff --git a/demos/nifi-kafka-druid-water-level-data/setup-superset.yaml b/demos/nifi-kafka-druid-water-level-data/setup-superset.yaml
index d939f0f..b3c8f05 100644
--- a/demos/nifi-kafka-druid-water-level-data/setup-superset.yaml
+++ b/demos/nifi-kafka-druid-water-level-data/setup-superset.yaml
@@ -9,7 +9,7 @@ spec:
containers:
- name: setup-superset
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
- command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/nifi-kafka-druid-water-level-data/superset-assets.zip && python -u /tmp/script/script.py"]
+ command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/nifi-kafka-druid-water-level-data/superset-assets.zip && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
mountPath: /tmp/script
diff --git a/demos/signal-processing/create-nifi-ingestion-job.yaml b/demos/signal-processing/create-nifi-ingestion-job.yaml
index ae57ef3..a53a264 100644
--- a/demos/signal-processing/create-nifi-ingestion-job.yaml
+++ b/demos/signal-processing/create-nifi-ingestion-job.yaml
@@ -17,7 +17,7 @@ spec:
- name: create-nifi-ingestion-job
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
command: ["bash", "-c", "export PGPASSWORD=$(cat /timescale-admin-credentials/password) && \
- curl -O https://raw.githubusercontent.com/stackabletech/demos/main/demos/signal-processing/DownloadAndWriteToDB.xml && \
+ curl -O https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/signal-processing/DownloadAndWriteToDB.xml && \
sed -i \"s/PLACEHOLDERPGPASSWORD/$PGPASSWORD/g\" DownloadAndWriteToDB.xml && \
python -u /tmp/script/script.py"]
volumeMounts:
diff --git a/demos/spark-k8s-anomaly-detection-taxi-data/setup-superset.yaml b/demos/spark-k8s-anomaly-detection-taxi-data/setup-superset.yaml
index 624efd1..5a88025 100644
--- a/demos/spark-k8s-anomaly-detection-taxi-data/setup-superset.yaml
+++ b/demos/spark-k8s-anomaly-detection-taxi-data/setup-superset.yaml
@@ -9,7 +9,7 @@ spec:
containers:
- name: setup-superset
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
- command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/spark-k8s-anomaly-detection-taxi-data/superset-assets.zip && python -u /tmp/script/script.py"]
+ command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/spark-k8s-anomaly-detection-taxi-data/superset-assets.zip && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
mountPath: /tmp/script
diff --git a/demos/trino-taxi-data/setup-superset.yaml b/demos/trino-taxi-data/setup-superset.yaml
index 6a9a59b..26ba204 100644
--- a/demos/trino-taxi-data/setup-superset.yaml
+++ b/demos/trino-taxi-data/setup-superset.yaml
@@ -9,7 +9,7 @@ spec:
containers:
- name: setup-superset
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable1.23.0
- command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/main/demos/trino-taxi-data/superset-assets.zip && python -u /tmp/script/script.py"]
+ command: ["bash", "-c", "curl -o superset-assets.zip https://raw.githubusercontent.com/stackabletech/demos/release-1.23/demos/trino-taxi-data/superset-assets.zip && python -u /tmp/script/script.py"]
volumeMounts:
- name: script
mountPath: /tmp/script
diff --git a/stacks/_templates/jupyterhub.yaml b/stacks/_templates/jupyterhub.yaml
index a33fafc..ef9885b 100644
--- a/stacks/_templates/jupyterhub.yaml
+++ b/stacks/_templates/jupyterhub.yaml
@@ -51,7 +51,7 @@ options:
initContainers:
- name: download-notebook
image: docker.stackable.tech/stackable/tools:1.0.0-stackable1.23.0
- command: ['sh', '-c', 'curl https://raw.githubusercontent.com/stackabletech/demos/main/stacks/jupyterhub-pyspark-hdfs/notebook.ipynb -o /notebook/notebook.ipynb']
+ command: ['sh', '-c', 'curl https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/jupyterhub-pyspark-hdfs/notebook.ipynb -o /notebook/notebook.ipynb']
volumeMounts:
- mountPath: /notebook
name: notebook
diff --git a/stacks/end-to-end-security/superset.yaml b/stacks/end-to-end-security/superset.yaml
index b68da47..6bc3e96 100644
--- a/stacks/end-to-end-security/superset.yaml
+++ b/stacks/end-to-end-security/superset.yaml
@@ -31,7 +31,7 @@ spec:
- -c
- |
cd /tmp
- curl --fail -O https://raw.githubusercontent.com/stackabletech/demos/main/stacks/end-to-end-security/postgres_superset_dump.sql.gz
+ curl --fail -O https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/end-to-end-security/postgres_superset_dump.sql.gz
gunzip postgres_superset_dump.sql.gz
# We need to omit changing the users password, as otherwise the content in the Secrets does not match
diff --git a/stacks/signal-processing/jupyterhub.yaml b/stacks/signal-processing/jupyterhub.yaml
index e73ae48..cf170e0 100644
--- a/stacks/signal-processing/jupyterhub.yaml
+++ b/stacks/signal-processing/jupyterhub.yaml
@@ -42,7 +42,7 @@ options:
initContainers:
- name: download-notebook
image: docker.stackable.tech/stackable/tools:1.0.0-stackable1.23.0
- command: ['sh', '-c', 'curl https://raw.githubusercontent.com/stackabletech/demos/main/stacks/signal-processing/tsdb.ipynb -o /notebook/tsdb.ipynb']
+ command: ['sh', '-c', 'curl https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/signal-processing/tsdb.ipynb -o /notebook/tsdb.ipynb']
volumeMounts:
- mountPath: /notebook
name: notebook
diff --git a/stacks/stacks-v2.yaml b/stacks/stacks-v2.yaml
index 6a5d10e..21b2037 100644
--- a/stacks/stacks-v2.yaml
+++ b/stacks/stacks-v2.yaml
@@ -11,9 +11,9 @@ stacks:
- prometheus
- grafana
manifests:
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/monitoring/grafana-dashboards.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/prometheus.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/prometheus-service-monitor.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/monitoring/grafana-dashboards.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/prometheus.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/prometheus-service-monitor.yaml
supportedNamespaces: []
resourceRequests:
cpu: 1750m
@@ -37,11 +37,11 @@ stacks:
- opensearch-dashboards
- vector
manifests:
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/opensearch.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/opensearch-dashboards.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/logging/setup-opensearch-dashboards.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/vector-aggregator.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/vector-aggregator-discovery.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/opensearch.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/opensearch-dashboards.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/logging/setup-opensearch-dashboards.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/vector-aggregator.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/vector-aggregator-discovery.yaml
supportedNamespaces: []
resourceRequests:
cpu: 5150m
@@ -71,14 +71,14 @@ stacks:
- observability
- tracing
manifests:
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/observability/jaeger.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/observability/opentelemetry-operator.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/observability/grafana-admin-credentials.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/observability/grafana.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/observability/grafana-tempo.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/observability/grafana-loki.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/observability/opentelemetry-collector-sidecar.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/observability/opentelemetry-collector-deployment.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/observability/jaeger.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/observability/opentelemetry-operator.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/observability/grafana-admin-credentials.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/observability/grafana.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/observability/grafana-tempo.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/observability/grafana-loki.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/observability/opentelemetry-collector-sidecar.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/observability/opentelemetry-collector-deployment.yaml
parameters:
- name: grafanaAdminPassword
description: Password of the Grafana admin user
@@ -95,9 +95,9 @@ stacks:
labels:
- airflow
manifests:
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-airflow.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/redis-airflow.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/airflow/airflow.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-airflow.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/redis-airflow.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/airflow/airflow.yaml
supportedNamespaces: []
resourceRequests:
cpu: 3400m
@@ -135,17 +135,17 @@ stacks:
- minio
- s3
manifests:
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/minio-distributed.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-hive.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-hive-iceberg.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-superset.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/data-lakehouse-iceberg-trino-spark/s3-connection.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/data-lakehouse-iceberg-trino-spark/hive-metastores.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/data-lakehouse-iceberg-trino-spark/trino.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/data-lakehouse-iceberg-trino-spark/zookeeper.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/data-lakehouse-iceberg-trino-spark/kafka.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/data-lakehouse-iceberg-trino-spark/nifi.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/nifi-kafka-druid-superset-s3/superset.yaml # Reuse
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/minio-distributed.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-hive.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-hive-iceberg.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-superset.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/data-lakehouse-iceberg-trino-spark/s3-connection.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/data-lakehouse-iceberg-trino-spark/hive-metastores.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/data-lakehouse-iceberg-trino-spark/trino.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/data-lakehouse-iceberg-trino-spark/zookeeper.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/data-lakehouse-iceberg-trino-spark/kafka.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/data-lakehouse-iceberg-trino-spark/nifi.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/nifi-kafka-druid-superset-s3/superset.yaml # Reuse
supportedNamespaces: []
resourceRequests:
cpu: "71"
@@ -181,9 +181,9 @@ stacks:
- hbase
- hdfs
manifests:
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/hdfs-hbase/zookeeper.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/hdfs-hbase/hdfs.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/hdfs-hbase/hbase.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/hdfs-hbase/zookeeper.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/hdfs-hbase/hdfs.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/hdfs-hbase/hbase.yaml
supportedNamespaces: []
resourceRequests:
cpu: 4200m
@@ -210,14 +210,14 @@ stacks:
- minio
- s3
manifests:
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/minio.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-druid.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-superset.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/nifi-kafka-druid-superset-s3/zookeeper.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/nifi-kafka-druid-superset-s3/kafka.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/nifi-kafka-druid-superset-s3/druid.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/nifi-kafka-druid-superset-s3/superset.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/nifi-kafka-druid-superset-s3/nifi.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/minio.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-druid.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-superset.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/nifi-kafka-druid-superset-s3/zookeeper.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/nifi-kafka-druid-superset-s3/kafka.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/nifi-kafka-druid-superset-s3/druid.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/nifi-kafka-druid-superset-s3/superset.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/nifi-kafka-druid-superset-s3/nifi.yaml
supportedNamespaces: []
resourceRequests:
cpu: 8900m
@@ -254,15 +254,15 @@ stacks:
- minio
- s3
manifests:
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/minio.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-hive.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-hive-iceberg.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-superset.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/trino-superset-s3/s3-connection.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/spark-trino-superset-s3/hive-metastore.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/trino-superset-s3/trino.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/spark-trino-superset-s3/trino-prediction-catalog.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/trino-superset-s3/superset.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/minio.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-hive.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-hive-iceberg.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-superset.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/trino-superset-s3/s3-connection.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/spark-trino-superset-s3/hive-metastore.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/trino-superset-s3/trino.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/spark-trino-superset-s3/trino-prediction-catalog.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/trino-superset-s3/superset.yaml
supportedNamespaces: []
resourceRequests:
cpu: 7100m
@@ -298,13 +298,13 @@ stacks:
- minio
- s3
manifests:
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/minio.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-hive.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-superset.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/trino-superset-s3/s3-connection.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/trino-superset-s3/hive-metastore.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/trino-superset-s3/trino.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/trino-superset-s3/superset.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/minio.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-hive.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-superset.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/trino-superset-s3/s3-connection.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/trino-superset-s3/hive-metastore.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/trino-superset-s3/trino.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/trino-superset-s3/superset.yaml
supportedNamespaces: []
resourceRequests:
cpu: 6800m
@@ -340,11 +340,11 @@ stacks:
- minio
- s3
manifests:
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/minio-distributed-small.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-hive-iceberg.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/trino-iceberg/s3-connection.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/trino-iceberg/hive-metastores.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/trino-iceberg/trino.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/minio-distributed-small.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-hive-iceberg.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/trino-iceberg/s3-connection.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/trino-iceberg/hive-metastores.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/trino-iceberg/trino.yaml
supportedNamespaces: []
resourceRequests:
cpu: 6000m # Measured 5600m
@@ -372,12 +372,12 @@ stacks:
- hdfs
- pyspark
manifests:
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/jupyterhub.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/jupyterhub-pyspark-hdfs/zookeeper.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/jupyterhub-pyspark-hdfs/hdfs.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/jupyterhub-pyspark-hdfs/serviceaccount.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/jupyterhub.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/jupyterhub-pyspark-hdfs/zookeeper.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/jupyterhub-pyspark-hdfs/hdfs.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/jupyterhub-pyspark-hdfs/serviceaccount.yaml
# TODO Use patched JHub that created service for us from customer setup (ask Sebastian)
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/jupyterhub-pyspark-hdfs/spark_driver_service.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/jupyterhub-pyspark-hdfs/spark_driver_service.yaml
supportedNamespaces: []
resourceRequests:
cpu: 3350m
@@ -405,12 +405,12 @@ stacks:
- hdfs
- s3
manifests:
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-hivehdfs.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-hives3.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/minio.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/dual-hive-hdfs-s3/hdfs.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/dual-hive-hdfs-s3/hive.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/dual-hive-hdfs-s3/trino.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-hivehdfs.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-hives3.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/minio.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/dual-hive-hdfs-s3/hdfs.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/dual-hive-hdfs-s3/hive.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/dual-hive-hdfs-s3/trino.yaml
supportedNamespaces: []
resourceRequests:
cpu: 7750m
@@ -435,7 +435,7 @@ stacks:
- authentication
- ldap
manifests:
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/authentication/openldap-tls.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/authentication/openldap-tls.yaml
supportedNamespaces: ["default"]
resourceRequests:
cpu: 1950m
@@ -458,8 +458,8 @@ stacks:
- authentication
- ldap
manifests:
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/authentication/openldap-tls.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/authentication/openldap-tls-authenticationclass.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/authentication/openldap-tls.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/authentication/openldap-tls-authenticationclass.yaml
supportedNamespaces: []
resourceRequests:
cpu: 1950m
@@ -490,18 +490,18 @@ stacks:
- authentication
- sso
manifests:
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-superset.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-druid.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/keycloak-opa-poc/serviceaccount.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/keycloak-opa-poc/keycloak.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/keycloak-opa-poc/setup-keycloak.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/keycloak-opa-poc/opa.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/keycloak-opa-poc/policies.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/keycloak-opa-poc/zookeeper.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/keycloak-opa-poc/hdfs.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/keycloak-opa-poc/druid.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/keycloak-opa-poc/trino.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/keycloak-opa-poc/superset.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-superset.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-druid.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/keycloak-opa-poc/serviceaccount.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/keycloak-opa-poc/keycloak.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/keycloak-opa-poc/setup-keycloak.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/keycloak-opa-poc/opa.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/keycloak-opa-poc/policies.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/keycloak-opa-poc/zookeeper.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/keycloak-opa-poc/hdfs.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/keycloak-opa-poc/druid.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/keycloak-opa-poc/trino.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/keycloak-opa-poc/superset.yaml
supportedNamespaces: ["default"] # ClusterRoleBinding needs explicit namespace
resourceRequests:
cpu: 7850m
@@ -567,22 +567,22 @@ stacks:
memory: 19586Mi
pvc: 40Gi
manifests:
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/keycloak-serviceaccount.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/end-to-end-security/keycloak-realm-config.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/keycloak.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-hive-iceberg.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-superset.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/end-to-end-security/krb5.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/end-to-end-security/kerberos-secretclass.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/end-to-end-security/opa.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/end-to-end-security/zookeeper.yaml # TODO: Add authentication
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/end-to-end-security/hdfs.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/end-to-end-security/hdfs-regorules.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/end-to-end-security/hive-metastore.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/end-to-end-security/trino.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/end-to-end-security/trino-regorules.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/end-to-end-security/trino-policies.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/end-to-end-security/superset.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/keycloak-serviceaccount.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/end-to-end-security/keycloak-realm-config.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/keycloak.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-hive-iceberg.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-superset.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/end-to-end-security/krb5.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/end-to-end-security/kerberos-secretclass.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/end-to-end-security/opa.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/end-to-end-security/zookeeper.yaml # TODO: Add authentication
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/end-to-end-security/hdfs.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/end-to-end-security/hdfs-regorules.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/end-to-end-security/hive-metastore.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/end-to-end-security/trino.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/end-to-end-security/trino-regorules.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/end-to-end-security/trino-policies.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/end-to-end-security/superset.yaml
parameters:
- name: keycloakAdminPassword
description: Password of the Keycloak admin user
@@ -624,15 +624,15 @@ stacks:
- jupyterhub
- grafana
manifests:
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/signal-processing/secrets.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/signal-processing/grafana-dashboards.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/signal-processing/grafana.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/_templates/postgresql-timescaledb.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/nifi-kafka-druid-superset-s3/zookeeper.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/signal-processing/nifi.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/jupyterhub-pyspark-hdfs/serviceaccount.yaml
- - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/jupyterhub-pyspark-hdfs/spark_driver_service.yaml
- - helmChart: https://raw.githubusercontent.com/stackabletech/demos/main/stacks/signal-processing/jupyterhub.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/signal-processing/secrets.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/signal-processing/grafana-dashboards.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/signal-processing/grafana.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/_templates/postgresql-timescaledb.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/nifi-kafka-druid-superset-s3/zookeeper.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/signal-processing/nifi.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/jupyterhub-pyspark-hdfs/serviceaccount.yaml
+ - plainYaml: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/jupyterhub-pyspark-hdfs/spark_driver_service.yaml
+ - helmChart: https://raw.githubusercontent.com/stackabletech/demos/release-1.23/stacks/signal-processing/jupyterhub.yaml
parameters:
- name: nifiAdminPassword
description: Password of the NiFI admin user
Techassi approved these changes on Nov 18, 2024.
Part of stackabletech/stackable-utils#84.
To be used by the stackable-utils/release script.
See the example diffs in the comments below (generated by running the script from a fake release branch).
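As a rough sketch of the release-time substitution this enables (the version, the `release-<version>` branch naming, the file list, and the sed invocations below are assumptions for illustration, not the actual stackable-utils implementation):

```bash
#!/usr/bin/env bash
# Hypothetical sketch only: pin the placeholder refs in the demos repo to a release.
# RELEASE, BRANCH and the file list are illustrative assumptions, not the real script.
set -euo pipefail

RELEASE="1.23"
BRANCH="release-${RELEASE}"

for file in stacks/stacks-v2.yaml demos/demos-v2.yaml; do
  # Point raw.githubusercontent.com URLs at the release branch instead of main
  sed -i "s|stackabletech/demos/main/|stackabletech/demos/${BRANCH}/|g" "${file}"
  # Replace whatever stackableRelease value was there (24.7, dev, ...) with the release version
  sed -i -E "s|^([[:space:]]*stackableRelease:).*|\1 ${RELEASE}|" "${file}"
done
```

Run from the repository root, a substitution along these lines would produce diffs similar to the ones shown above, with every `main` ref rewritten to the release branch and every `stackableRelease` pinned to the release version.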