Update refs script #132

Merged 13 commits on Nov 18, 2024
2 changes: 1 addition & 1 deletion .gitattributes
@@ -2,4 +2,4 @@ docs/** -linguist-documentation

*.adoc linguist-detectable
*.yaml linguist-detectable
*.yml linguist-detectable
*.yml linguist-detectable
4 changes: 4 additions & 0 deletions .github/workflows/dev_jupyter-pyspark-with-alibi-detect.yaml
@@ -3,6 +3,8 @@ name: Build and publish jupyter-pyspark-with-alibi-detect

env:
IMAGE_NAME: jupyter-pyspark-with-alibi-detect
# TODO (@NickLarsenNZ): Use a versioned image with stackable0.0.0-dev or stackableXX.X.X so that
# the demo is reproducible for the release and it will be automatically replaced for the release branch.
IMAGE_VERSION: python-3.9
REGISTRY_PATH: stackable
DOCKERFILE_PATH: "demos/signal-processing/Dockerfile-jupyter"
@@ -12,6 +14,8 @@ on:
push:
branches:
- main
# TODO (@NickLarsenNZ): Also build on release branches, but with a stackable0.0.0-dev or stackableXX.X.X tag.
# - release-*
paths:
- demos/signal-processing/Dockerfile-jupyter
- demos/signal-processing/requirements.txt
4 changes: 4 additions & 0 deletions .github/workflows/dev_nifi.yaml
@@ -3,6 +3,8 @@ name: Build and publish NiFi for signal-processing demo

env:
IMAGE_NAME: nifi
# TODO (@NickLarsenNZ): Use a versioned image with stackable0.0.0-dev or stackableXX.X.X so that
# the demo is reproducible for the release and it will be automatically replaced for the release branch.
IMAGE_VERSION: 1.27.0-postgresql
REGISTRY_PATH: stackable
DOCKERFILE_PATH: "demos/signal-processing/Dockerfile-nifi"
@@ -12,6 +14,8 @@ on:
push:
branches:
- main
# TODO (@NickLarsenNZ): Also build on release branches, but with a stackable0.0.0-dev or stackableXX.X.X tag.
# - release-*
paths:
- demos/signal-processing/Dockerfile-nifi
- .github/workflows/dev_nifi.yaml
6 changes: 5 additions & 1 deletion .github/workflows/dev_spark-k8s-with-scikit-learn.yaml
@@ -3,6 +3,8 @@ name: Build and publish spark-k8s-with-scikit-learn

env:
IMAGE_NAME: spark-k8s-with-scikit-learn
# TODO (@NickLarsenNZ): Use a versioned image with stackable0.0.0-dev or stackableXX.X.X so that
# the demo is reproducible for the release and it will be automatically replaced for the release branch.
IMAGE_VERSION: 3.5.0-stackable24.3.0
REGISTRY_PATH: stackable
DOCKERFILE_PATH: "demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data/Dockerfile"
@@ -12,6 +14,8 @@ on:
push:
branches:
- main
# TODO (@NickLarsenNZ): Also build on release branches, but with a stackable0.0.0-dev or stackableXX.X.X tag.
# - release-*
paths:
- demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data/Dockerfile
- demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data/requirements.txt
@@ -30,7 +34,7 @@ jobs:
# TODO: the image 3.5.0-stackable24.3.0 does not have an arm64 build.
# Re-activate the arm runner when the image is updated to one that does.
# Also adjust publish_manifest step to include arm architecture
#- {name: "ubicloud-standard-8-arm", arch: "arm64"}
# - {name: "ubicloud-standard-8-arm", arch: "arm64"}
steps:
- name: Checkout Repository
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332 # v4.1.7
4 changes: 3 additions & 1 deletion .gitignore
@@ -1 +1,3 @@
.env
.env
.envrc
.direnv/
93 changes: 93 additions & 0 deletions .scripts/update_refs.sh
@@ -0,0 +1,93 @@
#!/usr/bin/env bash
set -euo pipefail

# This script is used by the stackable-utils release script to update the demos
# repository branch references as well as the stackableRelease versions so that
# demos are properly versioned.

# Parse args:
# $1 if `commit` is specified as the first argument, then changes will be staged and committed.
COMMIT="${1:-false}"
COMMIT="${COMMIT/commit/true}"

CURRENT_BRANCH=$(git rev-parse --abbrev-ref HEAD)

# Ensure we are not on the `main` branch.
if [[ "$CURRENT_BRANCH" == "main" ]]; then
  >&2 echo "Will not replace github references for the main branch. Exiting."
  exit 1
fi

# Ensure the index is clean
if ! git diff-index --quiet HEAD --; then
  >&2 echo "Dirty git index. Check working tree or staged changes. Exiting."
  exit 2
fi

# Prepend a string to each line of stdout
function prepend {
  while read -r line; do
    echo -e "${1}${line}"
  done
}

# Stage and commit based on a message
function maybe_commit {
  [ "$COMMIT" == "true" ] || return 0
  local MESSAGE="$1"
  PATCH=$(mktemp)
  git add -u
  git diff --staged > "$PATCH"
  git commit -S -m "$MESSAGE" --no-verify
  echo "patch written to: $PATCH" | prepend "\t"
}

if [[ "$CURRENT_BRANCH" == release-* ]]; then
  STACKABLE_RELEASE="${CURRENT_BRANCH#release-}"
  MESSAGE="Update stackableRelease to $STACKABLE_RELEASE"
  echo "$MESSAGE"
  # NOTE (@NickLarsenNZ): find is not required for such a trivial case, but it is done for consistency
  find stacks/stacks-v2.yaml \
    -exec grep --color=always -l stackableRelease {} \; \
    -exec sed -i -E "s|(stackableRelease:\s+)(\S+)|\1${STACKABLE_RELEASE}|" {} \; \
    | prepend "\t"
  maybe_commit "chore(release): $MESSAGE"

  # Replace 0.0.0-dev refs with ${STACKABLE_RELEASE}.0
  # TODO (@NickLarsenNZ): handle patches later, and what about release-candidates?
  SEARCH='stackable(0\.0\.0-dev|24\.7\.[0-9]+)' # TODO (@NickLarsenNZ): After https://github.com/stackabletech/stackable-cockpit/issues/310, only search for 0.0.0-dev
  REPLACEMENT="stackable${STACKABLE_RELEASE}.0" # TODO (@NickLarsenNZ): Be a bit smarter about patch releases.
  MESSAGE="Update image references with $REPLACEMENT"
  echo "$MESSAGE"
  find demos stacks -type f \
    -exec grep --color=always -lE "$SEARCH" {} \; \
    -exec sed -i -E "s|${SEARCH}|${REPLACEMENT}|" {} \; \
    | prepend "\t"
  maybe_commit "chore(release): $MESSAGE"

  # Look for remaining references
  echo "Checking for files with older Stackable release references; these are assumed to be intentional."
  grep --color=always -ronE "stackable24\.3(\.[0-9]+)" | prepend "\t"
  echo
else
  >&2 echo "WARNING: doesn't look like a release branch. Will not update stackableRelease versions in stacks and image references."
fi

MESSAGE="Replace githubusercontent references main->${CURRENT_BRANCH}"
echo "$MESSAGE"
# Search for githubusercontent URLs and replace the branch reference with a placeholder variable.
# This is done just in case the branch has special regex characters (like `/`).
# shellcheck disable=SC2016 # We intentionally don't want to expand the variable.
find demos stacks -type f \
  -exec grep --color=always -l githubusercontent {} \; \
  -exec sed -i -E 's|(stackabletech/demos)/main/|\1/\${UPDATE_BRANCH_REF}/|' {} \; \
  | prepend "\t"

# Now, for all modified files, we can use envsubst
export UPDATE_BRANCH_REF="$CURRENT_BRANCH"
for MODIFIED_FILE in $(git diff --name-only); do
  # shellcheck disable=SC2016 # We intentionally don't want to expand the variable.
  envsubst '$UPDATE_BRANCH_REF' < "$MODIFIED_FILE" > "$MODIFIED_FILE.replacements"
  mv "$MODIFIED_FILE.replacements" "$MODIFIED_FILE"
done
maybe_commit "chore(release): $MESSAGE"
1 change: 1 addition & 0 deletions .yamllint.yaml
@@ -8,3 +8,4 @@ rules:
comments:
min-spaces-from-content: 1 # Needed due to https://github.com/adrienverge/yamllint/issues/443
braces: disable # because the yaml files are templates which can have {{ ... }}
indentation: disable # There are many conflicting styles and it isn't so important in this repo. It can be enabled later if we want consistency.
45 changes: 0 additions & 45 deletions demos/data-lakehouse-iceberg-trino-spark/create-trino-tables.yaml
@@ -338,29 +338,6 @@ data:
)
""")
























run_query(connection, """
create table if not exists lakehouse.house_sales.house_sales with (
partitioning = ARRAY['year(date_of_transfer)']
@@ -504,23 +481,6 @@ data:
where tpep_pickup_datetime >= date '2015-01-01' and tpep_pickup_datetime <= now() -- We have to remove some invalid records
""")


















run_query(connection, """
create or replace materialized view lakehouse.taxi.yellow_tripdata_daily_agg as
select
@@ -566,11 +526,6 @@ data:
REFRESH MATERIALIZED VIEW lakehouse.taxi.yellow_tripdata_monthly_agg
""")






# At this point Spark should have created the needed underlying tables
run_query(connection, """
create or replace view lakehouse.smart_city.shared_bikes_station_status_latest as
1 change: 1 addition & 0 deletions demos/demos-v1.yaml
@@ -1,3 +1,4 @@
---
demos:
please-update:
description: This version of stackablectl is outdated, please visit https://docs.stackable.tech/stackablectl/stable/installation.html on how to get the latest version
2 changes: 1 addition & 1 deletion demos/demos-v2.yaml
@@ -73,7 +73,7 @@ demos:
memory: 42034Mi
pvc: 75Gi # 30Gi for Kafka
nifi-kafka-druid-water-level-data:
description: Demo ingesting water level data into Kafka using NiFi, streaming it into Druid and creating a Superset dashboard
description: Demo ingesting water level data into Kafka using NiFi, streaming it into Druid and creating a Superset dashboard
documentation: https://docs.stackable.tech/stackablectl/stable/demos/nifi-kafka-druid-water-level-data.html
stackableStack: nifi-kafka-druid-superset-s3
labels:
1 change: 1 addition & 0 deletions demos/end-to-end-security/README.md
@@ -5,6 +5,7 @@
3. Optional: Add Database connection
4. Add the admin user in Keycloak to all relevant groups (so that they have access to the tables and can create datasets, charts and dashboards).
5. `pgdump` the Postgres and update the dump in Git. For that shell into `postgresql-superset-0` and execute

```sh
export PGPASSWORD="$POSTGRES_POSTGRES_PASSWORD"

@@ -1,3 +1,6 @@
#!/usr/bin/env bash
set -euo pipefail

# This script is not used for the demo
# Its purpose is to document how to retrieve the used earthquake data

4 changes: 3 additions & 1 deletion docs/modules/demos/images/end-to-end-security/README.md
@@ -1,3 +1,5 @@
# end-to-end-security

The images are exported from
https://docs.google.com/presentation/d/19h3sBve_dOSgpZ6eTZqmYXxGoiQqXNs1/edit?usp=sharing&ouid=105504333647320477456&rtpof=true&sd=true.
<https://docs.google.com/presentation/d/19h3sBve_dOSgpZ6eTZqmYXxGoiQqXNs1/edit?usp=sharing&ouid=105504333647320477456&rtpof=true&sd=true>
Ask Sebastian for access if needed.
@@ -151,6 +151,10 @@ This is described below.

Libraries can be added to a custom *product* image launched by the notebook. Suppose a Spark job is prepared like this:

// TODO (@NickLarsenNZ): Use stackable0.0.0-dev so that the demo is reproducible for the release
// and it will be automatically replaced for the release branch.
// Also update the reference in notebook.ipynb.

[source,python]
----
spark = (SparkSession
@@ -172,6 +176,10 @@

It requires a specific Spark image:

// TODO (@NickLarsenNZ): Use stackable0.0.0-dev so that the demo is reproducible for the release
// and it will be automatically replaced for the release branch.
// Also update the reference in notebook.ipynb.

[source,python]
----
.config("spark.kubernetes.container.image",
6 changes: 6 additions & 0 deletions shell.nix
@@ -0,0 +1,6 @@
{ pkgs ? import <nixpkgs> { } }:
pkgs.mkShell {
packages = with pkgs; [
gettext # envsubst
];
}
16 changes: 8 additions & 8 deletions stacks/_templates/keycloak.yaml
@@ -126,14 +126,14 @@ metadata:
labels:
app: keycloak
spec:
# We want a stable Keycloak address that does not change when Keycloak reboots.
# We could simply pick LoadBalancer here, but on-prem clusters often times don't support LBs,
# so the demo would not run on their environments. Additionally, LB addresses often take a while to allocate
# (order of minutes on GCP iirc). So there's no way for us to know whether there's no LB address because it's still
# in progress, or if there's no LB address because it's unsupported.
#
# But we can at least make sure to reconcile the AuthClass once Keycloak restarts.
# We achieve this by letting Keycloak itself propagate its address instead of a separate Job.
# We want a stable Keycloak address that does not change when Keycloak reboots.
# We could simply pick LoadBalancer here, but on-prem clusters often times don't support LBs,
# so the demo would not run on their environments. Additionally, LB addresses often take a while to allocate
# (order of minutes on GCP iirc). So there's no way for us to know whether there's no LB address because it's still
# in progress, or if there's no LB address because it's unsupported.
#
# But we can at least make sure to reconcile the AuthClass once Keycloak restarts.
# We achieve this by letting Keycloak itself propagate its address instead of a separate Job.
type: NodePort
selector:
app: keycloak
1 change: 1 addition & 0 deletions stacks/_templates/prometheus-service-monitor.yaml
@@ -1,3 +1,4 @@
---
apiVersion: monitoring.coreos.com/v1
kind: ServiceMonitor
metadata:
1 change: 1 addition & 0 deletions stacks/_templates/vector-aggregator-discovery.yaml
@@ -1,3 +1,4 @@
---
apiVersion: v1
kind: ConfigMap
metadata:
2 changes: 1 addition & 1 deletion stacks/authentication/openldap-tls.yaml
@@ -93,4 +93,4 @@ spec:
port: 1636
targetPort: tls-ldap
selector:
app.kubernetes.io/name: openldap
app.kubernetes.io/name: openldap
1 change: 0 additions & 1 deletion stacks/end-to-end-security/kerberos-secretclass.yaml
@@ -1,4 +1,3 @@

---
apiVersion: secrets.stackable.tech/v1alpha1
kind: SecretClass
2 changes: 1 addition & 1 deletion stacks/end-to-end-security/superset.yaml
@@ -23,7 +23,7 @@ spec:
spec:
# We need to restore the postgres state before the superset container itself starts some database migrations
initContainers:
# The postgres image does not contain curl or wget...
# The postgres image does not contain curl or wget...
- name: download-dump
image: docker.stackable.tech/stackable/testing-tools:0.2.0-stackable24.7.0
command:
2 changes: 1 addition & 1 deletion stacks/keycloak-opa-poc/policies.yaml
@@ -58,4 +58,4 @@ data:
# "57d3b407-ecc0-4cc1-aaaf-45a63f43b96b",
# "170b4130-ca4d-417b-b229-f2917d5ab3d1"
# ]
# }
# }
1 change: 1 addition & 0 deletions stacks/keycloak-opa-poc/setup-keycloak.yaml
@@ -1,3 +1,4 @@
---
apiVersion: v1
kind: Secret
metadata:
1 change: 1 addition & 0 deletions stacks/observability/grafana-admin-credentials.yaml
@@ -1,3 +1,4 @@
---
apiVersion: v1
kind: Secret
metadata:
3 changes: 2 additions & 1 deletion stacks/observability/grafana.yaml
@@ -1,5 +1,6 @@
# https://github.com/grafana/helm-charts/tree/main/charts/grafana
# yamllint disable rule:comments-indentation
---
# https://github.com/grafana/helm-charts/tree/main/charts/grafana
releaseName: grafana
name: grafana
repo: