Commit

chore: bump nifi for 24.11
razvan committed Sep 18, 2024
1 parent 62516a1 commit d5b4b25
Showing 3 changed files with 9 additions and 9 deletions.
2 changes: 1 addition & 1 deletion demos/signal-processing/Dockerfile-nifi
@@ -1,5 +1,5 @@
# docker build -f ./Dockerfile-nifi -t docker.stackable.tech/demos/nifi:1.27.0-postgresql .

-FROM docker.stackable.tech/stackable/nifi:1.27.0-stackable24.7.0
+FROM docker.stackable.tech/stackable/nifi:1.27.0-stackable24.11.0

RUN curl --fail -o /stackable/nifi/postgresql-42.6.0.jar "https://repo.stackable.tech/repository/misc/postgresql-timescaledb/postgresql-42.6.0.jar"
12 changes: 6 additions & 6 deletions demos/signal-processing/DownloadAndWriteToDB.xml
@@ -61,7 +61,7 @@
<bundle>
<artifact>nifi-dbcp-service-nar</artifact>
<group>org.apache.nifi</group>
-<version>1.21.0</version>
+<version>1.27.0</version>
</bundle>
<comments></comments>
<descriptors>
@@ -258,7 +258,7 @@
<bundle>
<artifact>nifi-record-serialization-services-nar</artifact>
<group>org.apache.nifi</group>
-<version>1.21.0</version>
+<version>1.27.0</version>
</bundle>
<comments></comments>
<descriptors>
@@ -561,7 +561,7 @@
</position>
<height>88.0</height>
<label>This flow downloads a dataset, writing it to a temporary table in TimescaleDB.
-This data is then written to the target table with the time offsets preserved,
+This data is then written to the target table with the time offsets preserved,
but re-based to the current time. This means that the data can be displayed
in Grafana as if it were being streamed, whereas in fact the dashboard moves
through "future" data that has already been persisted.</label>
@@ -584,7 +584,7 @@ through "future" data that has already been persisted.</label>
<bundle>
<artifact>nifi-standard-nar</artifact>
<group>org.apache.nifi</group>
-<version>1.21.0</version>
+<version>1.27.0</version>
</bundle>
<config>
<backoffMechanism>PENALIZE_FLOWFILE</backoffMechanism>
@@ -1069,7 +1069,7 @@ through "future" data that has already been persisted.</label>
<bundle>
<artifact>nifi-standard-nar</artifact>
<group>org.apache.nifi</group>
-<version>1.21.0</version>
+<version>1.27.0</version>
</bundle>
<config>
<backoffMechanism>PENALIZE_FLOWFILE</backoffMechanism>
@@ -1223,7 +1223,7 @@ from conditions_temp;</value>
<bundle>
<artifact>nifi-standard-nar</artifact>
<group>org.apache.nifi</group>
-<version>1.21.0</version>
+<version>1.27.0</version>
</bundle>
<config>
<backoffMechanism>PENALIZE_FLOWFILE</backoffMechanism>
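The flow label in the XML above describes replaying a recorded dataset: the relative time offsets between rows are preserved, but the series is re-based to the current time so Grafana renders it as if it were live. As a minimal Python sketch of that re-basing idea (the record structure and the `time` field name are assumptions for illustration, not taken from the NiFi flow):

```python
from datetime import datetime, timedelta, timezone


def rebase_timestamps(rows, ts_key="time"):
    """Shift a recorded series so it starts 'now' while keeping the
    relative offsets between readings intact, so replayed data looks
    as if it were being streamed."""
    if not rows:
        return []
    base = rows[0][ts_key]
    now = datetime.now(timezone.utc)
    # Each row keeps its offset from the first row, re-based onto 'now'.
    return [{**row, ts_key: now + (row[ts_key] - base)} for row in rows]
```

In the actual demo this transformation happens in SQL between the temporary and target TimescaleDB tables; the sketch only captures the offset arithmetic.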
4 changes: 2 additions & 2 deletions stacks/data-lakehouse-iceberg-trino-spark/nifi.yaml
@@ -5,7 +5,7 @@ metadata:
name: nifi
spec:
image:
-productVersion: 1.25.0
+productVersion: 1.27.0
clusterConfig:
authentication:
- authenticationClass: nifi-admin-credentials
@@ -52,7 +52,7 @@ kind: Secret
metadata:
name: nifi-admin-credentials-secret
stringData:
-admin: {{ nifiAdminPassword }}
+admin: {{nifiAdminPassword}}
---
apiVersion: zookeeper.stackable.tech/v1alpha1
kind: ZookeeperZnode
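The secret change above tightens the placeholder from `{{ nifiAdminPassword }}` to `{{nifiAdminPassword}}`. Which templating engine the stack tooling uses is not visible in this diff; as a generic illustration of why the spelling can matter, a whitespace-tolerant matcher accepts both forms, while a strict one would only resolve the exact spelling it expects:

```python
import re


def render(template: str, values: dict) -> str:
    # Whitespace-tolerant substitution: optional spaces inside the
    # braces, so both "{{ name }}" and "{{name}}" resolve. A stricter
    # engine might match only one of the two spellings.
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(values[m.group(1)]),
        template,
    )
```

For example, `render("admin: {{nifiAdminPassword}}", {"nifiAdminPassword": "secret"})` and the spaced variant both yield `admin: secret` with this tolerant matcher.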
