diff --git a/README.adoc b/README.adoc
index 852bc0b93..3649986e3 100644
--- a/README.adoc
+++ b/README.adoc
@@ -2,114 +2,20 @@
In this repository, you will find a collection of components that can meet various data integration use cases and requirements.
-The repository's primary focus is to provide a set of standalone Java functions that can be useful in the end-user applications as-is.
-
-Besides, this repository builds on the Java functions to generate standalone Spring Cloud Stream applications that can run against Spring Cloud Stream's RabbitMQ or Apache Kafka binder implementations.
+This repository builds on top of the https://github.com/spring-cloud/spring-functions-catalog[Spring Functions Catalog] and generates standalone Spring Cloud Stream applications that can run against Spring Cloud Stream's RabbitMQ or Apache Kafka binder implementations.
It is also possible to extend the generator to bundle the Java functions with the other supported binder implementations.
These applications can run standalone or as part of a data flow, such as the one orchestrated using Spring Cloud Data Flow.
=== Project Structure
-The repository includes two major sections - `Functions` and `Applications`.
-The former hosts the various Java functions, and the latter is for generating the standalone Spring Cloud Stream applications and hosting their related components.
-
The following are the various components of this repository.
-* https://github.com/spring-cloud/stream-applications/tree/master/functions[Standalone Java Functions]
* https://github.com/spring-cloud/stream-applications/tree/master/applications/stream-applications-core[Common Core Components for Applications]
* https://github.com/spring-cloud/stream-applications/tree/master/applications[Spring Cloud Stream Applications]
* https://github.com/spring-cloud/stream-applications/tree/master/stream-applications-build[Build Parent]
* https://github.com/spring-cloud/stream-applications/tree/master/stream-applications-release-train[Release Train]
-=== Reusable Functions
-
-|===
-| `java.util.Supplier` | `java.util.Function` | `java.util.Consumer`
-
-|link:functions/supplier/debezium-supplier/README.adoc[Debezium supplier]
-|link:functions/function/aggregator-function/README.adoc[Aggregator]
-|link:functions/consumer/analytics-consumer/README.adoc[Analytics]
-
-|link:functions/supplier/file-supplier/README.adoc[File]
-|link:functions/function/filter-function/README.adoc[Filter]
-|link:functions/consumer/cassandra-consumer/README.adoc[Cassandra]
-
-|link:functions/supplier/ftp-supplier/README.adoc[FTP]
-|link:functions/function/header-enricher-function/README.adoc[Header-Enricher]
-|link:functions/consumer/elasticsearch-consumer/README.adoc[Elasticsearch]
-
-|
-|link:functions/function/header-filter-function/README.adoc[Header-Filter]
-|link:functions/consumer/file-consumer/README.adoc[File]
-
-|link:functions/supplier/http-supplier/README.adoc[HTTP]
-|link:functions/function/http-request-function/README.adoc[HTTP Request]
-|link:functions/consumer/ftp-consumer/README.adoc[FTP]
-
-|link:functions/supplier/jdbc-supplier/README.adoc[JDBC]
-|link:functions/function/image-recognition-function/README.adoc[Image Recognition(Tensorflow)]
-|
-
-|link:functions/supplier/jms-supplier/README.adoc[JMS]
-|link:functions/function/object-detection-function/README.adoc[Object Detection(Tensorflow)]
-|link:functions/consumer/jdbc-consumer/README.adoc[JDBC]
-
-|link:functions/supplier/mail-supplier/README.adoc[Mail]
-|link:functions/function/semantic-segmentation-function/README.adoc[Semantic Segmentation(Tensorflow)]
-|link:functions/consumer/log-consumer/README.adoc[Log]
-
-|link:functions/supplier/mongodb-supplier/README.adoc[MongoDB]
-|link:functions/function/spel-function/README.adoc[SpEL]
-|link:functions/consumer/mongodb-consumer/README.adoc[MongoDB]
-
-|link:functions/supplier/mqtt-supplier/README.adoc[MQTT]
-
-|link:functions/function/splitter-function/README.adoc[Splitter]
-|link:functions/consumer/mqtt-consumer/README.adoc[MQTT]
-
-|link:functions/supplier/rabbit-supplier/README.adoc[RabbitMQ]
-|link:functions/function/task-launch-request-function/README.adoc[Task Launch Request]
-|link:functions/consumer/rabbit-consumer/README.adoc[RabbitMQ]
-
-|link:functions/supplier/s3-supplier/README.adoc[AWS S3]
-|
-|link:functions/consumer/redis-consumer/README.adoc[Redis]
-
-|link:functions/supplier/sftp-supplier/README.adoc[SFTP]
-|
-|link:functions/consumer/rsocket-consumer/README.adoc[RSocket]
-
-|link:functions/supplier/syslog-supplier/README.adoc[Syslog]
-|
-|link:functions/consumer/s3-consumer/README.adoc[AWS S3]
-
-|link:functions/supplier/tcp-supplier/README.adoc[TCP]
-|
-|link:functions/consumer/sftp-consumer/README.adoc[SFTP]
-
-|link:functions/supplier/time-supplier/README.adoc[Time]
-|
-|link:functions/consumer/tcp-consumer/README.adoc[TCP]
-
-|link:functions/supplier/twitter-supplier/README.adoc[Twitter]
-|link:functions/function/twitter-function/README.adoc[Twitter]
-|link:functions/consumer/twitter-consumer/README.adoc[Twitter]
-
-|link:functions/supplier/websocket-supplier/README.adoc[Websocket]
-|
-|link:functions/consumer/websocket-consumer/README.adoc[Websocket]
-
-|
-|
-|link:functions/consumer/wavefront-consumer/README.adoc[Wavefront]
-
-|link:functions/supplier/xmpp-supplier/README.adoc[XMPP]
-|
-|link:functions/consumer/xmpp-consumer/README.adoc[XMPP]
-
-|===
-
=== Reusable Spring Cloud Stream Applications
|===
@@ -139,11 +45,11 @@ The following are the various components of this repository.
|
|link:applications/source/jms-source/README.adoc[JMS]
-|link:applications/processor/image-recognition-processor/README.adoc[Image Recognition(Tensorflow)]
+|
|link:applications/sink/jdbc-sink/README.adoc[JDBC]
|link:applications/source/load-generator-source/README.adoc[Load-Generator]
-|link:applications/processor/object-detection-processor/README.adoc[Object Detection(Tensorflow)]
+|
|link:applications/sink/log-sink/README.adoc[Log]
|link:applications/source/mail-source/README.adoc[Mail]
@@ -151,7 +57,7 @@ The following are the various components of this repository.
|link:applications/sink/mongodb-sink/README.adoc[MongoDB]
|link:applications/source/mongodb-source/README.adoc[MongoDB]
-|link:applications/processor/semantic-segmentation-processor/README.adoc[Semantic Segmentation(Tensorflow)]
+|
|link:applications/sink/mqtt-sink/README.adoc[MQTT]
|link:applications/source/mqtt-source/README.adoc[MQTT]
@@ -180,7 +86,7 @@ The following are the various components of this repository.
|link:applications/source/time-source/README.adoc[Time]
|
-|link:applications/sink/tasklauncher-sink/README.adoc[Task Launcher]
+|
|link:applications/source/twitter-message-source/README.adoc[Twitter Message]
|
@@ -253,7 +159,8 @@ You can then build the desired apps.
./build-app.sh . applications/sink/log-sink
....
-NOTE: In order to disable metrics by default there needs to be application properties configured like in `default-application.properties`. The `build-app.sh` script will copy default-application.properties into src/main/resources if no application.properties,yml,yaml or json is present.
+NOTE: To disable metrics by default, application properties must be configured as in `default-application.properties`.
+The `build-app.sh` script copies `default-application.properties` into `src/main/resources` if no `application.properties`, `.yml`, `.yaml`, or `.json` file is present.
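
For reference, a minimal sketch of such a file; the single property below is taken from the `application.yml` files removed elsewhere in this change:

[source,properties]
....
# Disable Micrometer metrics export unless a deployer re-enables it
management.defaults.metrics.export.enabled=false
....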
=== Additional Resources
diff --git a/applications/processor/aggregator-processor/pom.xml b/applications/processor/aggregator-processor/pom.xml
index 925103deb..e98704e84 100644
--- a/applications/processor/aggregator-processor/pom.xml
+++ b/applications/processor/aggregator-processor/pom.xml
@@ -17,7 +17,7 @@
 			<groupId>org.springframework.cloud.fn</groupId>
-			<artifactId>aggregator-function</artifactId>
+			<artifactId>spring-aggregator-function</artifactId>
@@ -35,13 +35,13 @@
 						<name>aggregator</name>
 						<type>processor</type>
 						<version>${project.version}</version>
-						<configClass>org.springframework.cloud.fn.aggregator.AggregatorFunctionConfiguration.class</configClass>
+						<configClass>AUTOCONFIGURATION</configClass>
 						<functionDefinition>jsonBytesToMap|aggregatorFunction</functionDefinition>
 								<groupId>org.springframework.cloud.fn</groupId>
-								<artifactId>aggregator-function</artifactId>
+								<artifactId>spring-aggregator-function</artifactId>
diff --git a/applications/processor/filter-processor/pom.xml b/applications/processor/filter-processor/pom.xml
index 1b2f620c1..53795c9a6 100644
--- a/applications/processor/filter-processor/pom.xml
+++ b/applications/processor/filter-processor/pom.xml
@@ -17,7 +17,7 @@
 			<groupId>org.springframework.cloud.fn</groupId>
-			<artifactId>filter-function</artifactId>
+			<artifactId>spring-filter-function</artifactId>
@@ -43,13 +43,13 @@
 						<name>filter</name>
 						<type>processor</type>
 						<version>${project.version}</version>
-						<configClass>org.springframework.cloud.fn.filter.FilterFunctionConfiguration.class</configClass>
+						<configClass>AUTOCONFIGURATION</configClass>
 						<functionDefinition>byteArrayTextToString|filterFunction</functionDefinition>
 								<groupId>org.springframework.cloud.fn</groupId>
-								<artifactId>filter-function</artifactId>
+								<artifactId>spring-filter-function</artifactId>
diff --git a/applications/processor/groovy-processor/pom.xml b/applications/processor/groovy-processor/pom.xml
index c8c72fea2..dd37ad156 100644
--- a/applications/processor/groovy-processor/pom.xml
+++ b/applications/processor/groovy-processor/pom.xml
@@ -54,7 +54,7 @@
 			<groupId>org.springframework.cloud.fn</groupId>
-			<artifactId>payload-converter-function</artifactId>
+			<artifactId>spring-payload-converter-function</artifactId>
 			<groupId>org.springframework.boot</groupId>
diff --git a/applications/processor/header-enricher-processor/pom.xml b/applications/processor/header-enricher-processor/pom.xml
index 77e34ba33..63b21db10 100644
--- a/applications/processor/header-enricher-processor/pom.xml
+++ b/applications/processor/header-enricher-processor/pom.xml
@@ -17,7 +17,7 @@
 			<groupId>org.springframework.cloud.fn</groupId>
-			<artifactId>header-enricher-function</artifactId>
+			<artifactId>spring-header-enricher-function</artifactId>
 			<groupId>org.springframework.integration</groupId>
@@ -48,13 +48,13 @@
 						<name>header-enricher</name>
 						<type>processor</type>
 						<version>${project.version}</version>
-						<configClass>org.springframework.cloud.fn.header.enricher.HeaderEnricherFunctionConfiguration.class</configClass>
+						<configClass>AUTOCONFIGURATION</configClass>
 						<functionDefinition>headerEnricherFunction</functionDefinition>
 								<groupId>org.springframework.cloud.fn</groupId>
-								<artifactId>header-enricher-function</artifactId>
+								<artifactId>spring-header-enricher-function</artifactId>
diff --git a/applications/processor/header-filter-processor/pom.xml b/applications/processor/header-filter-processor/pom.xml
index d36d4cd89..3a5d5e275 100644
--- a/applications/processor/header-filter-processor/pom.xml
+++ b/applications/processor/header-filter-processor/pom.xml
@@ -17,7 +17,7 @@
 			<groupId>org.springframework.cloud.fn</groupId>
-			<artifactId>header-filter-function</artifactId>
+			<artifactId>spring-header-filter-function</artifactId>
 			<groupId>org.springframework.integration</groupId>
@@ -53,13 +53,13 @@
 						<name>header-filter</name>
 						<type>processor</type>
 						<version>${project.version}</version>
-						<configClass>org.springframework.cloud.fn.header.filter.HeaderFilterFunctionConfiguration.class</configClass>
+						<configClass>AUTOCONFIGURATION</configClass>
 						<functionDefinition>headerFilterFunction</functionDefinition>
 								<groupId>org.springframework.cloud.fn</groupId>
-								<artifactId>header-filter-function</artifactId>
+								<artifactId>spring-header-filter-function</artifactId>
diff --git a/applications/processor/http-request-processor/README.adoc b/applications/processor/http-request-processor/README.adoc
index 44a8b4779..338e6b70c 100644
--- a/applications/processor/http-request-processor/README.adoc
+++ b/applications/processor/http-request-processor/README.adoc
@@ -52,7 +52,7 @@ $$body-expression$$:: $$A SpEL expression to derive the request body from the in
$$expected-response-type$$:: $$The type used to interpret the response.$$ *($$Class<?>$$, default: `$$$$`)*
$$headers-expression$$:: $$A SpEL expression used to derive the http headers map to use.$$ *($$Expression$$, default: `$$$$`)*
$$http-method-expression$$:: $$A SpEL expression to derive the request method from the incoming message.$$ *($$Expression$$, default: `$$$$`)*
-$$reply-expression$$:: $$A SpEL expression used to compute the final result, applied against the whole http {@link org.springframework.http.ResponseEntity}.$$ *($$Expression$$, default: `$$$$`)*
+$$reply-expression$$:: $$A SpEL expression used to compute the final result, applied against the whole http {@link ResponseEntity}.$$ *($$Expression$$, default: `$$$$`)*
$$timeout$$:: $$Request timeout in milliseconds.$$ *($$Long$$, default: `$$30000$$`)*
$$url-expression$$:: $$A SpEL expression against incoming message to determine the URL to use.$$ *($$Expression$$, default: `$$$$`)*
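
By way of illustration, a hypothetical configuration of these expression properties (the `http.request` prefix is assumed here, since this excerpt does not show it) might look like:

[source,properties]
....
# Derive the target URL and method from each incoming message (values illustrative)
http.request.url-expression='https://example.org/api/items'
http.request.http-method-expression='POST'
# Return only the response body instead of the whole ResponseEntity
http.request.reply-expression=body
....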
diff --git a/applications/processor/http-request-processor/pom.xml b/applications/processor/http-request-processor/pom.xml
index bf8dfc783..9eddc9b9f 100644
--- a/applications/processor/http-request-processor/pom.xml
+++ b/applications/processor/http-request-processor/pom.xml
@@ -21,7 +21,7 @@
 			<groupId>org.springframework.cloud.fn</groupId>
-			<artifactId>http-request-function</artifactId>
+			<artifactId>spring-http-request-function</artifactId>
 			<groupId>com.squareup.okhttp3</groupId>
@@ -56,15 +56,14 @@
 						<name>http-request</name>
 						<type>processor</type>
 						<version>${project.version}</version>
-						<configClass>org.springframework.cloud.fn.http.request.HttpRequestFunctionConfiguration.class
-						</configClass>
+						<configClass>AUTOCONFIGURATION</configClass>
 						<functionDefinition>httpRequestFunction</functionDefinition>
 								<groupId>org.springframework.cloud.fn</groupId>
-								<artifactId>http-request-function</artifactId>
+								<artifactId>spring-http-request-function</artifactId>
diff --git a/applications/processor/image-recognition-processor/README.adoc b/applications/processor/image-recognition-processor/README.adoc
deleted file mode 100644
index 84377609c..000000000
--- a/applications/processor/image-recognition-processor/README.adoc
+++ /dev/null
@@ -1,56 +0,0 @@
-//tag::ref-doc[]
-:image-root: https://raw.githubusercontent.com/spring-cloud-stream-app-starters/tensorflow/master/images
-
-= Image Recognition Processor
-
-A processor that uses an https://github.com/tensorflow/models/tree/master/inception[Inception model] to classify
-in real-time images into different categories (e.g. labels).
-
-Model implements a deep https://en.wikipedia.org/wiki/Convolutional_neural_network[Convolutional Neural Network] that can achieve reasonable performance on hard visual recognition tasks
-- matching or exceeding human performance in some domains like https://www.tensorflow.org/tutorials/image_recognition[image recognition].
-
-The input of the model is an image as binary array.
-
-The output is a JSON message in this format:
-
-[source,json]
-....
-{
- "labels" : [
- {"giant panda":0.98649305}
- ]
-}
-....
-
-Result contains the name of the recognized category (e.g. label) along with the confidence (e.g. confidence) that the image represents this category.
-
-If the `response-size` property is set to a value higher than 1, then the result will include the top `response-size` most probable labels. For example `response-size=3` would return:
-
-[source,json]
-....
-{
- "labels": [
- {"giant panda":0.98649305},
- {"badger":0.010562794},
- {"ice bear":0.001130851}
- ]
-}
-....
-
-== Payload
-
-If the incoming type is `byte[]` and the content type is set to `application/octet-stream`, then the application processes the input `byte[]` image and outputs an augmented `byte[]` image payload and a JSON header.
-
-== Options
-
-//tag::configuration-properties[]
-$$image.recognition.cache-model$$:: $$cache the pre-trained tensorflow model.$$ *($$Boolean$$, default: `$$$$`)*
-$$image.recognition.debug-output$$:: $$$$ *($$Boolean$$, default: `$$$$`)*
-$$image.recognition.debug-output-path$$:: $$$$ *($$String$$, default: `$$$$`)*
-$$image.recognition.model$$:: $$pre-trained tensorflow image recognition model. Note that the model must match the selected model type!$$ *($$String$$, default: `$$$$`)*
-$$image.recognition.model-type$$:: $$Supports three different pre-trained tensorflow image recognition models: Inception, MobileNetV1 and MobileNetV2 1. Inception graph uses 'input' as input and 'output' as output. 2. MobileNetV2 pre-trained models: https://github.com/tensorflow/models/tree/master/research/slim/nets/mobilenet#pretrained-models - normalized image size is always square (e.g. H=W) - graph uses 'input' as input and 'MobilenetV2/Predictions/Reshape_1' as output. 3. MobileNetV1 pre-trained models: https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet_v1.md#pre-trained-models - graph uses 'input' as input and 'MobilenetV1/Predictions/Reshape_1' as output.$$ *($$ModelType$$, default: `$$$$`, possible values: `inception`,`mobilenetv1`,`mobilenetv2`)*
-$$image.recognition.normalized-image-size$$:: $$Normalized image size.$$ *($$Integer$$, default: `$$$$`)*
-$$image.recognition.response-size$$:: $$number of recognized images.$$ *($$Integer$$, default: `$$$$`)*
-//end::configuration-properties[]
-
-//end::ref-doc[]
diff --git a/applications/processor/image-recognition-processor/pom.xml b/applications/processor/image-recognition-processor/pom.xml
deleted file mode 100644
index ca4581057..000000000
--- a/applications/processor/image-recognition-processor/pom.xml
+++ /dev/null
@@ -1,69 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
-	<modelVersion>4.0.0</modelVersion>
-	<artifactId>image-recognition-processor</artifactId>
-	<name>image-recognition-processor</name>
-	<description>Image recognition (tensorflow) processor apps</description>
-	<packaging>jar</packaging>
-
-	<parent>
-		<groupId>org.springframework.cloud.stream.app</groupId>
-		<artifactId>stream-applications-core</artifactId>
-		<version>5.0.0-SNAPSHOT</version>
-		<relativePath>../../stream-applications-core/pom.xml</relativePath>
-	</parent>
-
-	<dependencies>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-configuration-processor</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.cloud.fn</groupId>
-			<artifactId>image-recognition-function</artifactId>
-		</dependency>
-	</dependencies>
-
-	<build>
-		<plugins>
-			<plugin>
-				<groupId>org.apache.maven.plugins</groupId>
-				<artifactId>maven-deploy-plugin</artifactId>
-				<version>3.0.0</version>
-				<configuration>
-					<skip>false</skip>
-				</configuration>
-			</plugin>
-			<plugin>
-				<groupId>org.springframework.cloud</groupId>
-				<artifactId>spring-cloud-dataflow-apps-docs-plugin</artifactId>
-			</plugin>
-			<plugin>
-				<groupId>org.springframework.cloud</groupId>
-				<artifactId>spring-cloud-dataflow-apps-generator-plugin</artifactId>
-				<configuration>
-					<application>
-						<name>image-recognition</name>
-						<type>processor</type>
-						<version>${project.version}</version>
-						<configClass>org.springframework.cloud.stream.app.processor.image.recognition.ImageRecognitionProcessorConfiguration.class</configClass>
-						<functionDefinition>imageRecognitionFunction</functionDefinition>
-						<maven>
-							<dependencies>
-								<dependency>
-									<groupId>org.springframework.cloud.stream.app</groupId>
-									<artifactId>image-recognition-processor</artifactId>
-									<version>${project.version}</version>
-								</dependency>
-							</dependencies>
-						</maven>
-					</application>
-				</configuration>
-			</plugin>
-		</plugins>
-	</build>
-
-</project>
diff --git a/applications/processor/image-recognition-processor/src/main/java/org/springframework/cloud/stream/app/processor/image/recognition/ImageRecognitionProcessorConfiguration.java b/applications/processor/image-recognition-processor/src/main/java/org/springframework/cloud/stream/app/processor/image/recognition/ImageRecognitionProcessorConfiguration.java
deleted file mode 100644
index b6a676435..000000000
--- a/applications/processor/image-recognition-processor/src/main/java/org/springframework/cloud/stream/app/processor/image/recognition/ImageRecognitionProcessorConfiguration.java
+++ /dev/null
@@ -1,114 +0,0 @@
-/*
- * Copyright 2020-2020 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * https://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.springframework.cloud.stream.app.processor.image.recognition;
-
-import java.io.FileOutputStream;
-import java.io.IOException;
-import java.util.List;
-import java.util.function.Function;
-
-import org.apache.commons.io.IOUtils;
-import org.apache.commons.logging.Log;
-import org.apache.commons.logging.LogFactory;
-
-import org.springframework.boot.context.properties.EnableConfigurationProperties;
-import org.springframework.cloud.fn.common.tensorflow.deprecated.JsonMapperFunction;
-import org.springframework.cloud.fn.image.recognition.ImageRecognition;
-import org.springframework.cloud.fn.image.recognition.ImageRecognitionAugmenter;
-import org.springframework.cloud.fn.image.recognition.RecognitionResponse;
-import org.springframework.context.annotation.Bean;
-import org.springframework.context.annotation.Configuration;
-import org.springframework.integration.support.MessageBuilder;
-import org.springframework.messaging.Message;
-
-/**
- * @author Christian Tzolov
- */
-@Configuration
-@EnableConfigurationProperties(ImageRecognitionProcessorProperties.class)
-public class ImageRecognitionProcessorConfiguration {
-
- private static final Log logger = LogFactory.getLog(ImageRecognitionProcessorConfiguration.class);
-
- /**
- * Name of the Message header containing the JSON encoded recognition response.
- */
- public static final String RECOGNIZED_OBJECTS_HEADER = "recognized_objects";
-
- @Bean
-	public Function<Message<byte[]>, Message<byte[]>> imageRecognitionFunction(ImageRecognitionProcessorProperties properties) {
-
- return input -> {
- // You can use file:, http: or classpath: to provide the path to the input image.
- byte[] inputImage = input.getPayload();
-
- try (ImageRecognition imageRecognition = createImageRecognitionFunction(properties)) {
-
-			List<RecognitionResponse> recognizedObjects =
- ImageRecognition.toRecognitionResponse(imageRecognition.recognizeTopK(inputImage));
-
- // Draw the predicted labels on top of the input image.
- byte[] augmentedImage = new ImageRecognitionAugmenter().apply(inputImage, recognizedObjects);
-
- String jsonRecognizedObjects = new JsonMapperFunction().apply(recognizedObjects);
-
-				Message<byte[]> outMessage = MessageBuilder
- .withPayload(augmentedImage)
- .setHeader(RECOGNIZED_OBJECTS_HEADER, jsonRecognizedObjects)
- .build();
-
- if (properties.isDebugOutput()) {
- try {
- logger.info("recognized objects = " + jsonRecognizedObjects);
- IOUtils.write(augmentedImage, new FileOutputStream(properties.getDebugOutputPath()));
- }
- catch (IOException e) {
- logger.warn("Cloud not produce debug output", e);
- }
- }
-
- return outMessage;
- }
- };
- }
-
- private static ImageRecognition createImageRecognitionFunction(ImageRecognitionProcessorProperties properties) {
- switch (properties.getModelType()) {
- case inception:
- return ImageRecognition.inception(
- properties.getModel(),
- properties.getNormalizedImageSize(),
- properties.getResponseSize(),
- properties.isCacheModel());
- case mobilenetv1:
- return ImageRecognition.mobileNetV1(
- properties.getModel(),
- properties.getNormalizedImageSize(),
- properties.getResponseSize(),
- properties.isCacheModel());
- case mobilenetv2:
- return ImageRecognition.mobileNetV2(
- properties.getModel(),
- properties.getNormalizedImageSize(),
- properties.getResponseSize(),
- properties.isCacheModel());
- default:
- throw new RuntimeException("Not supported Model Type: " + properties.getModelType());
-
- }
- }
-}
diff --git a/applications/processor/image-recognition-processor/src/main/java/org/springframework/cloud/stream/app/processor/image/recognition/ImageRecognitionProcessorProperties.java b/applications/processor/image-recognition-processor/src/main/java/org/springframework/cloud/stream/app/processor/image/recognition/ImageRecognitionProcessorProperties.java
deleted file mode 100644
index e59e3e9e9..000000000
--- a/applications/processor/image-recognition-processor/src/main/java/org/springframework/cloud/stream/app/processor/image/recognition/ImageRecognitionProcessorProperties.java
+++ /dev/null
@@ -1,126 +0,0 @@
-/*
- * Copyright 2020-2020 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * https://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.springframework.cloud.stream.app.processor.image.recognition;
-
-import org.springframework.boot.context.properties.ConfigurationProperties;
-import org.springframework.validation.annotation.Validated;
-
-/**
- * @author Christian Tzolov
- */
-@ConfigurationProperties("image.recognition")
-@Validated
-public class ImageRecognitionProcessorProperties {
-
- enum ModelType {
- inception,
- mobilenetv1,
- mobilenetv2
- }
-
- /**
- * Supports three different pre-trained tensorflow image recognition models: Inception, MobileNetV1 and MobileNetV2
- *
- * 1. Inception graph uses 'input' as input and 'output' as output.
- * 2. MobileNetV2 pre-trained models: https://github.com/tensorflow/models/tree/master/research/slim/nets/mobilenet#pretrained-models
- * - normalized image size is always square (e.g. H=W)
- * - graph uses 'input' as input and 'MobilenetV2/Predictions/Reshape_1' as output.
- * 3. MobileNetV1 pre-trained models: https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet_v1.md#pre-trained-models
- * - graph uses 'input' as input and 'MobilenetV1/Predictions/Reshape_1' as output.
- */
- private ModelType modelType = ModelType.mobilenetv2;
-
- /**
- * pre-trained tensorflow image recognition model. Note that the model must match the selected model type!
- */
- private String model = "https://storage.googleapis.com/mobilenet_v2/checkpoints/mobilenet_v2_1.4_224.tgz#mobilenet_v2_1.4_224_frozen.pb";
-
- /**
- * cache the pre-trained tensorflow model.
- */
- private boolean cacheModel = true;
-
- /**
- * Normalized image size.
- */
- private int normalizedImageSize = 224;
-
- /**
- * number of recognized images.
- */
- private int responseSize = 5;
-
- private boolean debugOutput = false;
-
- private String debugOutputPath = "image-recognition-result.png";
-
- public ModelType getModelType() {
- return modelType;
- }
-
- public void setModelType(ModelType modelType) {
- this.modelType = modelType;
- }
-
- public String getModel() {
- return model;
- }
-
- public void setModel(String model) {
- this.model = model;
- }
-
- public boolean isCacheModel() {
- return cacheModel;
- }
-
- public void setCacheModel(boolean cacheModel) {
- this.cacheModel = cacheModel;
- }
-
- public int getNormalizedImageSize() {
- return normalizedImageSize;
- }
-
- public void setNormalizedImageSize(int normalizedImageSize) {
- this.normalizedImageSize = normalizedImageSize;
- }
-
- public int getResponseSize() {
- return responseSize;
- }
-
- public void setResponseSize(int responseSize) {
- this.responseSize = responseSize;
- }
-
- public boolean isDebugOutput() {
- return debugOutput;
- }
-
- public void setDebugOutput(boolean debugOutput) {
- this.debugOutput = debugOutput;
- }
-
- public String getDebugOutputPath() {
- return debugOutputPath;
- }
-
- public void setDebugOutputPath(String debugOutputPath) {
- this.debugOutputPath = debugOutputPath;
- }
-}
diff --git a/applications/processor/image-recognition-processor/src/main/resources/META-INF/dataflow-configuration-metadata.properties b/applications/processor/image-recognition-processor/src/main/resources/META-INF/dataflow-configuration-metadata.properties
deleted file mode 100644
index 4b2d310f4..000000000
--- a/applications/processor/image-recognition-processor/src/main/resources/META-INF/dataflow-configuration-metadata.properties
+++ /dev/null
@@ -1,2 +0,0 @@
-configuration-properties.classes=\
- org.springframework.cloud.stream.app.processor.image.recognition.ImageRecognitionProcessorProperties
diff --git a/applications/processor/image-recognition-processor/src/main/resources/application.yml b/applications/processor/image-recognition-processor/src/main/resources/application.yml
deleted file mode 100644
index c608c20d3..000000000
--- a/applications/processor/image-recognition-processor/src/main/resources/application.yml
+++ /dev/null
@@ -1,16 +0,0 @@
-image:
- recognition:
- model: https://storage.googleapis.com/mobilenet_v2/checkpoints/mobilenet_v2_1.4_224.tgz#mobilenet_v2_1.4_224_frozen.pb
- modelType: mobilenetv2
- responseSize: 3
- normalizedImageSize: 224
- cacheModel: true
-spring:
- cloud:
- function:
- definition: imageRecognitionFunction
-management:
- defaults:
- metrics:
- export:
- enabled: false
\ No newline at end of file
diff --git a/applications/processor/image-recognition-processor/src/test/java/org/springframework/cloud/stream/app/processor/image/recognition/ImageRecognitionProcessorTests.java b/applications/processor/image-recognition-processor/src/test/java/org/springframework/cloud/stream/app/processor/image/recognition/ImageRecognitionProcessorTests.java
deleted file mode 100644
index 5695ed117..000000000
--- a/applications/processor/image-recognition-processor/src/test/java/org/springframework/cloud/stream/app/processor/image/recognition/ImageRecognitionProcessorTests.java
+++ /dev/null
@@ -1,190 +0,0 @@
-/*
- * Copyright 2020-2020 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * https://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.springframework.cloud.stream.app.processor.image.recognition;
-
-import java.io.IOException;
-import java.math.BigDecimal;
-import java.math.RoundingMode;
-import java.util.ArrayList;
-import java.util.List;
-import java.util.Map;
-import java.util.function.Consumer;
-
-import com.fasterxml.jackson.core.JsonProcessingException;
-import com.fasterxml.jackson.databind.ObjectMapper;
-import org.junit.jupiter.api.Test;
-
-import org.springframework.boot.WebApplicationType;
-import org.springframework.boot.autoconfigure.SpringBootApplication;
-import org.springframework.boot.builder.SpringApplicationBuilder;
-import org.springframework.cloud.fn.common.tensorflow.deprecated.GraphicsUtils;
-import org.springframework.cloud.stream.binder.test.InputDestination;
-import org.springframework.cloud.stream.binder.test.OutputDestination;
-import org.springframework.cloud.stream.binder.test.TestChannelBinderConfiguration;
-import org.springframework.context.ConfigurableApplicationContext;
-import org.springframework.context.annotation.Import;
-import org.springframework.messaging.Message;
-import org.springframework.messaging.support.GenericMessage;
-
-import static org.assertj.core.api.Assertions.assertThat;
-
-/**
- * @author Christian Tzolov
- */
-public class ImageRecognitionProcessorTests {
-
- private ObjectMapper objectMapper = new ObjectMapper();
-
- @Test
- public void testImageRecognitionProcessorMobileNetV2() throws IOException {
- List
 			<groupId>org.springframework.cloud.fn</groupId>
-			<artifactId>payload-converter-function</artifactId>
-			<version>${java-functions.version}</version>
+			<artifactId>spring-payload-converter-function</artifactId>
+
 			<groupId>org.springframework.boot</groupId>
diff --git a/applications/processor/script-processor/src/main/java/org/springframework/cloud/stream/app/processor/script/ScriptProcessorConfiguration.java b/applications/processor/script-processor/src/main/java/org/springframework/cloud/stream/app/processor/script/ScriptProcessorConfiguration.java
index 9f66d08d1..770974379 100644
--- a/applications/processor/script-processor/src/main/java/org/springframework/cloud/stream/app/processor/script/ScriptProcessorConfiguration.java
+++ b/applications/processor/script-processor/src/main/java/org/springframework/cloud/stream/app/processor/script/ScriptProcessorConfiguration.java
@@ -1,5 +1,5 @@
/*
- * Copyright 2016-2020 the original author or authors.
+ * Copyright 2016-2024 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -30,6 +30,7 @@
import org.springframework.core.io.Resource;
import org.springframework.integration.handler.MessageProcessor;
import org.springframework.integration.scripting.ScriptVariableGenerator;
+import org.springframework.integration.scripting.dsl.ScriptSpec;
import org.springframework.integration.scripting.dsl.Scripts;
import org.springframework.messaging.Message;
@@ -47,7 +48,7 @@
* @author Artem Bilan
* @author Soby Chacko
*/
-@Configuration
+@Configuration(proxyBeanMethods = false)
@EnableConfigurationProperties(ScriptProcessorProperties.class)
@Import(ScriptVariableGeneratorConfiguration.class)
public class ScriptProcessorConfiguration {
@@ -69,12 +70,12 @@ public ScriptProcessorConfiguration(ScriptProcessorProperties properties, ScriptVariableGenerator scriptVariableGenerator) {
}
@Bean
-	public Function<Message<?>, Object> scriptProcessorFunction() {
- return processor()::processMessage;
+	public Function<Message<?>, Object> scriptProcessorFunction(MessageProcessor<?> messageProcessor) {
+ return messageProcessor::processMessage;
}
@Bean
-	public MessageProcessor<?> processor() {
+ public ScriptSpec processor() {
String language = this.properties.getLanguage();
String script = this.properties.getScript();
logger.info(String.format("Input script is '%s', language is '%s'", script, language));
@@ -82,8 +83,7 @@ public MessageProcessor<?> processor() {
return Scripts.processor(scriptResource)
.lang(language)
- .variableGenerator(scriptVariableGenerator)
- .get();
+ .variableGenerator(scriptVariableGenerator);
}
private static String decodeScript(String script) {
@@ -94,4 +94,5 @@ private static String decodeScript(String script) {
}
return toProcess.replaceAll(NEWLINE_ESCAPE, "\n").replaceAll(DOUBLE_DOUBLE_QUOTE, "\"");
}
+
}
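
The shape of this change follows from switching to `proxyBeanMethods = false`: bean methods are no longer proxied, so a direct `processor()` call inside `scriptProcessorFunction()` would build a second, unmanaged instance; injecting the container-built `MessageProcessor` as a parameter avoids that. A minimal sketch of the pattern with hypothetical beans:

[source,java]
....
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration(proxyBeanMethods = false)
class ExampleConfig {

	@Bean
	String greeting() {
		return "hello";
	}

	// With proxyBeanMethods = false, calling greeting() here would run the method
	// again rather than return the container-managed bean; parameter injection is
	// the reliable way to obtain the singleton.
	@Bean
	Integer greetingLength(String greeting) {
		return greeting.length();
	}

}
....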
diff --git a/applications/processor/semantic-segmentation-processor/README.adoc b/applications/processor/semantic-segmentation-processor/README.adoc
deleted file mode 100644
index 1716fceda..000000000
--- a/applications/processor/semantic-segmentation-processor/README.adoc
+++ /dev/null
@@ -1,42 +0,0 @@
-//tag::ref-doc[]
-
-= Semantic Segmentation Processor
-
-Image Semantic Segmentation based on the state-of-art https://github.com/tensorflow/models/tree/master/research/deeplab[DeepLab] Tensorflow model.
-
-The `Semantic Segmentation` is the process of associating each pixel of an image with a class label, (such as flower, person, road, sky, ocean, or car).
-Unlike the `Instance Segmentation`, which produces instance-aware region masks, the `Semantic Segmentation` produces class-aware masks.
-For implementing `Instance Segmentation` consult the https://github.com/spring-cloud/stream-applications/tree/master/functions/function/object-detection-function[Object Detection Service] instead.
-
-The `Semantic Segmentation Processor` uses the https://github.com/spring-cloud/stream-applications/tree/master/functions/function/semantic-segmentation-function[Semantic Segmentation Function] library and the https://github.com/spring-cloud/stream-applications/tree/master/functions/common/tensorflow-common[TensorFlow Service].
-
-== Payload
-
-The incoming type is `byte[]`, and the content type is `application/octet-stream`. The processor processes the input `byte[]` image and outputs augmented `byte[]` image payload and json header.
-
-Processor's input is an image byte array, and the output is an augmented image byte array, and a JSON header `semantic_segmentation` in this format:
-
-[source,json]
-....
-[
- [ 0, 0, 0 ],
- [ 127, 127, 127 ],
- [ 255, 255, 255 ],
- ...
-]
-....
-
-The output header json format represents the color pixel map computed from the input image.
-
-== Options
-
-//tag::configuration-properties[]
-$$semantic.segmentation.color-map-uri$$:: $$Every pre-trained model is based on certain object color maps. The pre-defined options are: - classpath:/colormap/citymap_colormap.json - classpath:/colormap/ade20k_colormap.json - classpath:/colormap/black_white_colormap.json - classpath:/colormap/mapillary_colormap.json$$ *($$String$$, default: `$$$$`)*
-$$semantic.segmentation.debug-output$$:: $$save the output image in the local debugOutputPath path.$$ *($$Boolean$$, default: `$$$$`)*
-$$semantic.segmentation.debug-output-path$$:: $$$$ *($$String$$, default: `$$$$`)*
-$$semantic.segmentation.mask-transparency$$:: $$The alpha color of the computed segmentation mask image.$$ *($$Float$$, default: `$$$$`)*
-$$semantic.segmentation.model$$:: $$pre-trained tensorflow semantic segmentation model.$$ *($$String$$, default: `$$$$`)*
-$$semantic.segmentation.output-type$$:: $$Specifies the output image type. You can return either the input image with the computed mask overlay, or the mask alone.$$ *($$OutputType$$, default: `$$$$`, possible values: `blended`,`mask`)*
-//end::configuration-properties[]
-
-//end::ref-doc[]
diff --git a/applications/processor/semantic-segmentation-processor/pom.xml b/applications/processor/semantic-segmentation-processor/pom.xml
deleted file mode 100644
index 5605cc5ed..000000000
--- a/applications/processor/semantic-segmentation-processor/pom.xml
+++ /dev/null
@@ -1,73 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
-	<modelVersion>4.0.0</modelVersion>
-	<artifactId>semantic-segmentation-processor</artifactId>
-	<name>semantic-segmentation-processor</name>
-	<description>Semantic Segmentation (tensorflow) processor apps</description>
-	<packaging>jar</packaging>
-
-	<parent>
-		<groupId>org.springframework.cloud.stream.app</groupId>
-		<artifactId>stream-applications-core</artifactId>
-		<version>5.0.0-SNAPSHOT</version>
-		<relativePath>../../stream-applications-core/pom.xml</relativePath>
-	</parent>
-
-	<dependencies>
-		<dependency>
-			<groupId>org.springframework.boot</groupId>
-			<artifactId>spring-boot-configuration-processor</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.cloud.fn</groupId>
-			<artifactId>semantic-segmentation-function</artifactId>
-		</dependency>
-		<dependency>
-			<groupId>org.springframework.cloud.fn</groupId>
-			<artifactId>object-detection-function</artifactId>
-		</dependency>
-	</dependencies>
-
-	<build>
-		<plugins>
-			<plugin>
-				<groupId>org.apache.maven.plugins</groupId>
-				<artifactId>maven-deploy-plugin</artifactId>
-				<version>3.0.0</version>
-				<configuration>
-					<skip>false</skip>
-				</configuration>
-			</plugin>
-			<plugin>
-				<groupId>org.springframework.cloud</groupId>
-				<artifactId>spring-cloud-dataflow-apps-docs-plugin</artifactId>
-			</plugin>
-			<plugin>
-				<groupId>org.springframework.cloud</groupId>
-				<artifactId>spring-cloud-dataflow-apps-generator-plugin</artifactId>
-				<configuration>
-					<application>
-						<name>semantic-segmentation</name>
-						<type>processor</type>
-						<version>${project.version}</version>
-						<configClass>org.springframework.cloud.stream.app.processor.semantic.segmentation.SemanticSegmentationProcessorConfiguration.class</configClass>
-						<functionDefinition>semanticSegmentationFunction</functionDefinition>
-						<maven>
-							<dependencies>
-								<dependency>
-									<groupId>org.springframework.cloud.stream.app</groupId>
-									<artifactId>semantic-segmentation-processor</artifactId>
-									<version>${project.version}</version>
-								</dependency>
-							</dependencies>
-						</maven>
-					</application>
-				</configuration>
-			</plugin>
-		</plugins>
-	</build>
-
-</project>
diff --git a/applications/processor/semantic-segmentation-processor/src/main/java/org/springframework/cloud/stream/app/processor/semantic/segmentation/SemanticSegmentationProcessorConfiguration.java b/applications/processor/semantic-segmentation-processor/src/main/java/org/springframework/cloud/stream/app/processor/semantic/segmentation/SemanticSegmentationProcessorConfiguration.java
deleted file mode 100644
index 7a6e55546..000000000
--- a/applications/processor/semantic-segmentation-processor/src/main/java/org/springframework/cloud/stream/app/processor/semantic/segmentation/SemanticSegmentationProcessorConfiguration.java
+++ /dev/null
@@ -1,89 +0,0 @@
-/*
- * Copyright 2020-2020 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * https://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.springframework.cloud.stream.app.processor.semantic.segmentation;
-
-import java.io.FileOutputStream;
-import java.io.IOException;
-import java.util.function.Function;
-
-import org.apache.commons.io.IOUtils;
-import org.apache.commons.logging.Log;
-import org.apache.commons.logging.LogFactory;
-
-import org.springframework.boot.context.properties.EnableConfigurationProperties;
-import org.springframework.cloud.fn.common.tensorflow.deprecated.JsonMapperFunction;
-import org.springframework.cloud.fn.semantic.segmentation.SegmentationColorMap;
-import org.springframework.cloud.fn.semantic.segmentation.SemanticSegmentation;
-import org.springframework.context.annotation.Bean;
-import org.springframework.context.annotation.Configuration;
-import org.springframework.integration.support.MessageBuilder;
-import org.springframework.messaging.Message;
-
-/**
- * @author Christian Tzolov
- */
-@Configuration
-@EnableConfigurationProperties(SemanticSegmentationProcessorProperties.class)
-public class SemanticSegmentationProcessorConfiguration {
-
- private static final Log logger = LogFactory.getLog(SemanticSegmentationProcessorConfiguration.class);
-
- /**
- * Output header name.
- */
- public static final String SEMANTIC_SEGMENTATION_HEADER = "semantic_segmentation";
-
- @Bean
- public SemanticSegmentation semanticSegmentation(SemanticSegmentationProcessorProperties properties) {
- return new SemanticSegmentation(properties.getModel(),
- SegmentationColorMap.loadColorMap(properties.getColorMapUri()), null,
- properties.getMaskTransparency());
- }
-
- @Bean
-	public Function<Message<byte[]>, Message<byte[]>> semanticSegmentationFunction(
- SemanticSegmentation semanticSegmentation,
- SemanticSegmentationProcessorProperties properties) {
-
- return input -> {
- // You can use file:, http: or classpath: to provide the path to the input image.
- byte[] inputImage = input.getPayload();
-
- byte[] outputImage = (properties.getOutputType() == SemanticSegmentationProcessorProperties.OutputType.blended) ?
- semanticSegmentation.blendMask(inputImage) : semanticSegmentation.maskImage(inputImage);
-
- long[][] maskPixels = semanticSegmentation.maskPixels(inputImage);
- String jsonMaskPixels = new JsonMapperFunction().apply(maskPixels);
-
-			Message<byte[]> outMessage = MessageBuilder
- .withPayload(outputImage)
- .setHeader(SEMANTIC_SEGMENTATION_HEADER, jsonMaskPixels)
- .build();
-
- if (properties.isDebugOutput()) {
- try {
- IOUtils.write(outputImage, new FileOutputStream(properties.getDebugOutputPath()));
- }
- catch (IOException e) {
- logger.warn("Cloud not produce debug output", e);
- }
- }
-
- return outMessage;
- };
- }
-}
diff --git a/applications/processor/semantic-segmentation-processor/src/main/java/org/springframework/cloud/stream/app/processor/semantic/segmentation/SemanticSegmentationProcessorProperties.java b/applications/processor/semantic-segmentation-processor/src/main/java/org/springframework/cloud/stream/app/processor/semantic/segmentation/SemanticSegmentationProcessorProperties.java
deleted file mode 100644
index fec7b807b..000000000
--- a/applications/processor/semantic-segmentation-processor/src/main/java/org/springframework/cloud/stream/app/processor/semantic/segmentation/SemanticSegmentationProcessorProperties.java
+++ /dev/null
@@ -1,117 +0,0 @@
-/*
- * Copyright 2020-2020 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * https://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.springframework.cloud.stream.app.processor.semantic.segmentation;
-
-import org.springframework.boot.context.properties.ConfigurationProperties;
-import org.springframework.validation.annotation.Validated;
-
-/**
- * @author Christian Tzolov
- */
-@ConfigurationProperties("semantic.segmentation")
-@Validated
-public class SemanticSegmentationProcessorProperties {
-
- enum OutputType {
- /** Input image augmented with the segmentation mask on top. */
- blended,
- /** Image of the segmentation mask. */
- mask
- }
-
- /**
- * pre-trained tensorflow semantic segmentation model.
- */
- private String model = "https://download.tensorflow.org/models/deeplabv3_mnv2_cityscapes_train_2018_02_05.tar.gz#frozen_inference_graph.pb";
-
- /**
- * Specifies the output image type. You can return either the input image with the computed mask overlay, or
- * the mask alone.
- */
- private OutputType outputType = OutputType.blended;
-
- /**
- * Every pre-trained model is based on certain object color maps.
- * The pre-defined options are:
- * - classpath:/colormap/citymap_colormap.json
- * - classpath:/colormap/ade20k_colormap.json
- * - classpath:/colormap/black_white_colormap.json
- * - classpath:/colormap/mapillary_colormap.json
- */
- private String colorMapUri = "classpath:/colormap/citymap_colormap.json";
-
- /**
- * The alpha color of the computed segmentation mask image.
- */
- private float maskTransparency = 0.45f;
-
- /**
-	 * save the output image in the local debugOutputPath path.
- */
- private boolean debugOutput = false;
-
- private String debugOutputPath = "semantic-segmentation-result.png";
-
- public OutputType getOutputType() {
- return outputType;
- }
-
- public void setOutputType(OutputType outputType) {
- this.outputType = outputType;
- }
-
- public boolean isDebugOutput() {
- return debugOutput;
- }
-
- public void setDebugOutput(boolean debugOutput) {
- this.debugOutput = debugOutput;
- }
-
- public String getDebugOutputPath() {
- return debugOutputPath;
- }
-
- public void setDebugOutputPath(String debugOutputPath) {
- this.debugOutputPath = debugOutputPath;
- }
-
- public String getModel() {
- return model;
- }
-
- public void setModel(String model) {
- this.model = model;
- }
-
- public String getColorMapUri() {
- return colorMapUri;
- }
-
- public void setColorMapUri(String colorMapUri) {
- this.colorMapUri = colorMapUri;
- }
-
- public float getMaskTransparency() {
- return maskTransparency;
- }
-
- public void setMaskTransparency(float maskTransparency) {
- this.maskTransparency = maskTransparency;
- }
-
-}
diff --git a/applications/processor/semantic-segmentation-processor/src/main/resources/META-INF/dataflow-configuration-metadata.properties b/applications/processor/semantic-segmentation-processor/src/main/resources/META-INF/dataflow-configuration-metadata.properties
deleted file mode 100644
index 104f7436f..000000000
--- a/applications/processor/semantic-segmentation-processor/src/main/resources/META-INF/dataflow-configuration-metadata.properties
+++ /dev/null
@@ -1,2 +0,0 @@
-configuration-properties.classes=\
- org.springframework.cloud.stream.app.processor.semantic.segmentation.SemanticSegmentationProcessorProperties
diff --git a/applications/processor/semantic-segmentation-processor/src/test/java/org/springframework/cloud/stream/app/processor/semantic/segmentation/SemanticSegmentationProcessorTests.java b/applications/processor/semantic-segmentation-processor/src/test/java/org/springframework/cloud/stream/app/processor/semantic/segmentation/SemanticSegmentationProcessorTests.java
deleted file mode 100644
index d95adc2f1..000000000
--- a/applications/processor/semantic-segmentation-processor/src/test/java/org/springframework/cloud/stream/app/processor/semantic/segmentation/SemanticSegmentationProcessorTests.java
+++ /dev/null
@@ -1,76 +0,0 @@
-/*
- * Copyright 2020-2020 the original author or authors.
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * https://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.springframework.cloud.stream.app.processor.semantic.segmentation;
-
-import java.io.IOException;
-
-import org.junit.jupiter.api.Test;
-
-import org.springframework.boot.WebApplicationType;
-import org.springframework.boot.autoconfigure.SpringBootApplication;
-import org.springframework.boot.builder.SpringApplicationBuilder;
-import org.springframework.cloud.fn.common.tensorflow.deprecated.GraphicsUtils;
-import org.springframework.cloud.stream.binder.test.InputDestination;
-import org.springframework.cloud.stream.binder.test.OutputDestination;
-import org.springframework.cloud.stream.binder.test.TestChannelBinderConfiguration;
-import org.springframework.context.ConfigurableApplicationContext;
-import org.springframework.context.annotation.Import;
-import org.springframework.messaging.Message;
-import org.springframework.messaging.support.GenericMessage;
-
-import static org.assertj.core.api.Assertions.assertThat;
-
-/**
- * @author Christian Tzolov
- */
-public class SemanticSegmentationProcessorTests {
-
- @Test
- public void testSemanticSegmentationProcessor() throws IOException {
- try (ConfigurableApplicationContext context = new SpringApplicationBuilder(
- TestChannelBinderConfiguration.getCompleteConfiguration(SemanticSegmentationTestApplication.class))
- .web(WebApplicationType.NONE)
- .run("--spring.cloud.function.definition=semanticSegmentationFunction",
- "--semantic.segmentation.model=https://download.tensorflow.org/models/deeplabv3_mnv2_cityscapes_train_2018_02_05.tar.gz#frozen_inference_graph.pb",
- "--semantic.segmentation.colorMapUri=classpath:/colormap/citymap_colormap.json",
- "--semantic.segmentation.outputType=blended",
- "--semantic.segmentation.debugOutput=true",
- "--semantic.segmentation.debugOutputPath=./target/semantic-segmentation-1.png")) {
-
- InputDestination processorInput = context.getBean(InputDestination.class);
- OutputDestination processorOutput = context.getBean(OutputDestination.class);
-
- byte[] inputImage = GraphicsUtils.loadAsByteArray("classpath:/images/amsterdam-cityscape1.jpg");
- processorInput.send(new GenericMessage<>(inputImage));
-			Message<byte[]> sourceMessage = processorOutput.receive(10000);
- String jsonRecognizedObjects = (String) sourceMessage.getHeaders().get(
- SemanticSegmentationProcessorConfiguration.SEMANTIC_SEGMENTATION_HEADER);
- assertThat(jsonRecognizedObjects).isNotEmpty();
- //assertThat(jsonRecognizedObjects)
- // .isEqualTo("[{\"label\":\"giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\",\"probability\":0.962329626083374}," +
- // "{\"label\":\"badger\",\"probability\":0.006058811210095882}," +
- // "{\"label\":\"ram, tup\",\"probability\":0.0010668420000001788}]");
- }
- }
-
-
- @SpringBootApplication
- @Import(SemanticSegmentationProcessorConfiguration.class)
- public static class SemanticSegmentationTestApplication {
- }
-
-}
diff --git a/applications/processor/semantic-segmentation-processor/src/test/resources/images/amsterdam-cityscape1.jpg b/applications/processor/semantic-segmentation-processor/src/test/resources/images/amsterdam-cityscape1.jpg
deleted file mode 100644
index d77cae406..000000000
Binary files a/applications/processor/semantic-segmentation-processor/src/test/resources/images/amsterdam-cityscape1.jpg and /dev/null differ
diff --git a/applications/processor/splitter-processor/README.adoc b/applications/processor/splitter-processor/README.adoc
index 482fa9da6..d82b18b23 100644
--- a/applications/processor/splitter-processor/README.adoc
+++ b/applications/processor/splitter-processor/README.adoc
@@ -20,7 +20,7 @@ If the incoming type is `byte[]` and the content type is set to `text/plain` or
$$splitter.apply-sequence$$:: $$Add correlation/sequence information in headers to facilitate later aggregation.$$ *($$Boolean$$, default: `$$true$$`)*
$$splitter.charset$$:: $$The charset to use when converting bytes in text-based files to String.$$ *($$String$$, default: `$$$$`)*
$$splitter.delimiters$$:: $$When expression is null, delimiters to use when tokenizing {@link String} payloads.$$ *($$String$$, default: `$$$$`)*
-$$splitter.expression$$:: $$A SpEL expression for splitting payloads.$$ *($$String$$, default: `$$$$`)*
+$$splitter.expression$$:: $$A SpEL expression for splitting payloads.$$ *($$Expression$$, default: `$$$$`)*
$$splitter.file-markers$$:: $$Set to true or false to use a {@code FileSplitter} (to split text-based files by line) that includes (or not) beginning/end of file markers.$$ *($$Boolean$$, default: `$$$$`)*
$$splitter.markers-json$$:: $$When 'fileMarkers == true', specify if they should be produced as FileSplitter.FileMarker objects or JSON.$$ *($$Boolean$$, default: `$$true$$`)*
//end::configuration-properties[]
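
As a sketch of the options above, splitting text payloads could be configured as follows; the property values are illustrative only:

[source,properties]
....
# Tokenize String payloads on commas (apply-sequence stays at its default of true)
splitter.delimiters=,
# A SpEL expression may be used instead, e.g.:
# splitter.expression=payload.split('\n')
....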
diff --git a/applications/processor/splitter-processor/pom.xml b/applications/processor/splitter-processor/pom.xml
index 2ba8e99e4..6237244b3 100644
--- a/applications/processor/splitter-processor/pom.xml
+++ b/applications/processor/splitter-processor/pom.xml
@@ -17,11 +17,11 @@
 			<groupId>org.springframework.cloud.fn</groupId>
-			<artifactId>splitter-function</artifactId>
+			<artifactId>spring-splitter-function</artifactId>
 			<groupId>org.springframework.cloud.fn</groupId>
-			<artifactId>payload-converter-function</artifactId>
+			<artifactId>spring-payload-converter-function</artifactId>
@@ -47,19 +47,18 @@
 						<name>splitter</name>
 						<type>processor</type>
 						<version>${project.version}</version>
-						<configClass>org.springframework.cloud.fn.splitter.SplitterFunctionConfiguration.class
-						</configClass>
+						<configClass>AUTOCONFIGURATION</configClass>
 						<functionDefinition>byteArrayTextToString|splitterFunction</functionDefinition>
 								<groupId>org.springframework.cloud.fn</groupId>
-								<artifactId>payload-converter-function</artifactId>
+								<artifactId>spring-payload-converter-function</artifactId>
 								<groupId>org.springframework.cloud.fn</groupId>
-								<artifactId>splitter-function</artifactId>
+								<artifactId>spring-splitter-function</artifactId>
diff --git a/applications/processor/transform-processor/README.adoc b/applications/processor/transform-processor/README.adoc
index 4a4eeb4fe..b998f0ecb 100644
--- a/applications/processor/transform-processor/README.adoc
+++ b/applications/processor/transform-processor/README.adoc
@@ -21,7 +21,7 @@ The incoming message can contain any type of payload.
== Options
//tag::configuration-properties[]
-$$spel.function.expression$$:: $$A SpEL expression to apply.$$ *($$String$$, default: `$$$$`)*
+$$spel.function.expression$$:: $$A SpEL expression to apply.$$ *($$Expression$$, default: `$$$$`)*
//end::configuration-properties[]
//end::ref-doc[]
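
For example, a minimal transform that upper-cases each incoming payload, sketched with the single option above:

[source,properties]
....
spel.function.expression=payload.toUpperCase()
....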
diff --git a/applications/processor/transform-processor/pom.xml b/applications/processor/transform-processor/pom.xml
index 01bb02fd2..05fabf930 100644
--- a/applications/processor/transform-processor/pom.xml
+++ b/applications/processor/transform-processor/pom.xml
@@ -17,7 +17,7 @@
 			<groupId>org.springframework.cloud.fn</groupId>
-			<artifactId>spel-function</artifactId>
+			<artifactId>spring-spel-function</artifactId>
@@ -43,13 +43,13 @@
 						<name>transform</name>
 						<type>processor</type>
 						<version>${project.version}</version>
-						<configClass>org.springframework.cloud.fn.spel.SpelFunctionConfiguration.class</configClass>
+						<configClass>AUTOCONFIGURATION</configClass>
 						<functionDefinition>byteArrayTextToString|spelFunction</functionDefinition>
 								<groupId>org.springframework.cloud.fn</groupId>
-								<artifactId>spel-function</artifactId>
+								<artifactId>spring-spel-function</artifactId>
diff --git a/applications/processor/twitter-trend-processor/README.adoc b/applications/processor/twitter-trend-processor/README.adoc
index 691466f86..780f448e6 100644
--- a/applications/processor/twitter-trend-processor/README.adoc
+++ b/applications/processor/twitter-trend-processor/README.adoc
@@ -31,12 +31,12 @@ Properties grouped by prefix:
=== twitter.trend.closest
-$$lat$$:: $$If provided with a long parameter the available trend locations will be sorted by distance, nearest to furthest, to the co-ordinate pair. The valid ranges for longitude is -180.0 to +180.0 (West is negative, East is positive) inclusive.$$ *($$Expression$$, default: `$$$$`)*
-$$lon$$:: $$If provided with a lat parameter the available trend locations will be sorted by distance, nearest to furthest, to the co-ordinate pair. The valid ranges for longitude is -180.0 to +180.0 (West is negative, East is positive) inclusive.$$ *($$Expression$$, default: `$$$$`)*
+$$lat$$:: $$If provided with a long parameter the available trend locations will be sorted by distance, nearest to the furthest, to the co-ordinate pair. The valid ranges for longitude is -180.0 to +180.0 (West is negative, East is positive) inclusive.$$ *($$Expression$$, default: `$$$$`)*
+$$lon$$:: $$If provided with a lat parameter the available trend locations will be sorted by distance, nearest to the furthest, to the co-ordinate pair. The valid ranges for longitude is -180.0 to +180.0 (West is negative, East is positive) inclusive.$$ *($$Expression$$, default: `$$$$`)*
=== twitter.trend
-$$location-id$$:: $$The Yahoo! Where On Earth ID of the location to return trending information for. Global information is available by using 1 as the WOEID.$$ *($$Expression$$, default: `$$payload$$`)*
+$$location-id$$:: $$The Yahoo! Where On Earth ID of the location to return trending information for. Global information is available by using 1 as the WOEID.$$ *($$Expression$$, default: `$$$$`)*
$$trend-query-type$$:: $$$$ *($$TrendQueryType$$, default: `$$$$`, possible values: `trend`,`trendLocation`)*
//end::configuration-properties[]
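
Mirroring the integration test updated later in this change, a trend lookup for a fixed location might be configured as:

[source,properties]
....
twitter.trend.trend-query-type=trend
# SpEL expression for the Yahoo! WOEID; the test below passes '2972'
twitter.trend.location-id='2972'
....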
diff --git a/applications/processor/twitter-trend-processor/pom.xml b/applications/processor/twitter-trend-processor/pom.xml
index b3238de76..d983c1b7c 100644
--- a/applications/processor/twitter-trend-processor/pom.xml
+++ b/applications/processor/twitter-trend-processor/pom.xml
@@ -18,7 +18,7 @@
 			<groupId>org.springframework.cloud.fn</groupId>
-			<artifactId>twitter-function</artifactId>
+			<artifactId>spring-twitter-function</artifactId>
 			<groupId>org.springframework.boot</groupId>
@@ -28,13 +28,6 @@
 			<groupId>org.mock-server</groupId>
 			<artifactId>mockserver-netty</artifactId>
-			<version>${mockserver.version}</version>
-			<scope>test</scope>
-		</dependency>
-		<dependency>
-			<groupId>org.mock-server</groupId>
-			<artifactId>mockserver-client-java</artifactId>
-			<version>${mockserver.version}</version>
 			<scope>test</scope>
@@ -61,14 +54,14 @@
 						<name>twitter-trend</name>
 						<type>processor</type>
 						<version>${project.version}</version>
-						<configClass>org.springframework.cloud.fn.twitter.trend.TwitterTrendFunctionConfiguration.class</configClass>
+						<configClass>AUTOCONFIGURATION</configClass>
 						<functionDefinition>twitterTrendFunction</functionDefinition>
 								<groupId>org.springframework.cloud.fn</groupId>
-								<artifactId>twitter-function</artifactId>
+								<artifactId>spring-twitter-function</artifactId>
diff --git a/applications/processor/twitter-trend-processor/src/test/java/org/springframework/cloud/stream/app/processor/twitter/trend/TwitterTrendProcessorIntegrationTests.java b/applications/processor/twitter-trend-processor/src/test/java/org/springframework/cloud/stream/app/processor/twitter/trend/TwitterTrendProcessorIntegrationTests.java
index 723027e36..94a968c4e 100644
--- a/applications/processor/twitter-trend-processor/src/test/java/org/springframework/cloud/stream/app/processor/twitter/trend/TwitterTrendProcessorIntegrationTests.java
+++ b/applications/processor/twitter-trend-processor/src/test/java/org/springframework/cloud/stream/app/processor/twitter/trend/TwitterTrendProcessorIntegrationTests.java
@@ -18,7 +18,6 @@
import java.nio.charset.StandardCharsets;
import java.time.Duration;
-import java.util.concurrent.TimeUnit;
import java.util.function.Function;
import org.junit.jupiter.api.AfterAll;
@@ -36,17 +35,14 @@
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.cloud.fn.common.twitter.TwitterConnectionProperties;
import org.springframework.cloud.fn.common.twitter.util.TwitterTestUtils;
-import org.springframework.cloud.fn.twitter.trend.TwitterTrendFunctionConfiguration;
import org.springframework.cloud.stream.binder.test.InputDestination;
import org.springframework.cloud.stream.binder.test.OutputDestination;
import org.springframework.cloud.stream.binder.test.TestChannelBinderConfiguration;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.annotation.Bean;
-import org.springframework.context.annotation.Import;
import org.springframework.context.annotation.Primary;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.GenericMessage;
-import org.springframework.test.util.TestSocketUtils;
import static org.assertj.core.api.Assertions.assertThat;
import static org.mockserver.matchers.Times.exactly;
@@ -56,22 +52,20 @@
/**
* @author Christian Tzolov
+ * @author Artem Bilan
*/
public class TwitterTrendProcessorIntegrationTests {
- private static final String MOCK_SERVER_IP = "127.0.0.1";
-
- private static final Integer MOCK_SERVER_PORT = TestSocketUtils.findAvailableTcpPort();
-
private static ClientAndServer mockServer;
private static MockServerClient mockClient;
+
private static HttpRequest trendsRequest;
@BeforeAll
public static void startServer() {
- mockServer = ClientAndServer.startClientAndServer(MOCK_SERVER_PORT);
- mockClient = new MockServerClient(MOCK_SERVER_IP, MOCK_SERVER_PORT);
+ mockServer = ClientAndServer.startClientAndServer();
+ mockClient = new MockServerClient("localhost", mockServer.getPort());
trendsRequest = setExpectation(request()
.withMethod("GET")
@@ -91,6 +85,7 @@ public void testTwitterTrendPayload() {
.web(WebApplicationType.NONE)
.run("--spring.cloud.function.definition=twitterTrendFunction",
+ "--twitter.trend.trend-query-type=trend",
"--twitter.trend.locationId='2972'",
"--twitter.connection.rawJson=true",
@@ -110,11 +105,6 @@ public void testTwitterTrendPayload() {
assertThat(outputMessage).isNotNull();
mockClient.verify(trendsRequest, once());
-
- //Resource trendsResource = new DefaultResourceLoader().getResource("classpath:/response/trends.json");
- //String expected = new String(StreamUtils.copyToByteArray(trendsResource.getInputStream()), StandardCharsets.UTF_8).trim();
- //String actual = new String(outputMessage.getPayload(), StandardCharsets.UTF_8);
- //JSONAssert.assertEquals(expected, actual, JSONCompareMode.LENIENT);
}
}
@@ -127,15 +117,14 @@ public static HttpRequest setExpectation(HttpRequest request) {
new Header("Content-Type", "application/json; charset=utf-8"),
new Header("Cache-Control", "public, max-age=86400"))
.withBody(TwitterTestUtils.asString("classpath:/response/trends.json"))
- .withDelay(TimeUnit.SECONDS, 1)
);
return request;
}
@SpringBootConfiguration
@EnableAutoConfiguration
- @Import(TwitterTrendFunctionConfiguration.class)
public static class TestTwitterTrendProcessorApplication {
+
@Bean
@Primary
public twitter4j.conf.Configuration twitterConfiguration2(TwitterConnectionProperties properties,
@@ -143,11 +132,11 @@ public twitter4j.conf.Configuration twitterConfiguration2(TwitterConnectionPrope
Function<TwitterConnectionProperties, ConfigurationBuilder> mockedConfiguration =
toConfigurationBuilder.andThen(
- new TwitterTestUtils().mockTwitterUrls(
- String.format("http://%s:%s", MOCK_SERVER_IP, MOCK_SERVER_PORT)));
+ new TwitterTestUtils().mockTwitterUrls("http://localhost:" + mockServer.getPort()));
return mockedConfiguration.apply(properties).build();
}
+
}
}
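
The test now lets MockServer pick a free port instead of reserving one up front with `TestSocketUtils`; the pattern, isolated as a sketch (same MockServer APIs as in the test above):

[source,java]
----
// Start on an OS-assigned free port, then read the port back; this avoids
// races between reserving a port and actually binding to it.
ClientAndServer mockServer = ClientAndServer.startClientAndServer();
MockServerClient mockClient = new MockServerClient("localhost", mockServer.getPort());

// Point the code under test at the dynamically allocated endpoint.
String baseUrl = "http://localhost:" + mockServer.getPort();
----
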
diff --git a/applications/sink/analytics-sink/pom.xml b/applications/sink/analytics-sink/pom.xml
index 40c660015..2987b4d6a 100644
--- a/applications/sink/analytics-sink/pom.xml
+++ b/applications/sink/analytics-sink/pom.xml
@@ -18,7 +18,7 @@
org.springframework.cloud.fn
- analytics-consumer
+ spring-analytics-consumer
org.awaitility
@@ -55,13 +55,13 @@
analytics
sink
${project.version}
- org.springframework.cloud.fn.consumer.analytics.AnalyticsConsumerConfiguration.class
+ AUTOCONFIGURATION
byteArrayTextToString|analyticsConsumer
org.springframework.cloud.fn
- analytics-consumer
+ spring-analytics-consumer
diff --git a/applications/sink/cassandra-sink/README.adoc b/applications/sink/cassandra-sink/README.adoc
index b9b648230..703bf16f3 100644
--- a/applications/sink/cassandra-sink/README.adoc
+++ b/applications/sink/cassandra-sink/README.adoc
@@ -24,7 +24,7 @@ $$entity-base-packages$$:: $$Base packages to scan for entities annotated with T
$$init-script$$:: $$Resource with CQL scripts (delimited by ';') to initialize keyspace schema.$$ *($$Resource$$, default: `$$$$`)*
$$skip-ssl-validation$$:: $$Flag to validate the Servers' SSL certs.$$ *($$Boolean$$, default: `$$false$$`)*
-=== cassandra
+=== cassandra.consumer
$$consistency-level$$:: $$The consistency level for write operation.$$ *($$ConsistencyLevel$$, default: `$$$$`)*
$$ingest-query$$:: $$Ingest Cassandra query.$$ *($$String$$, default: `$$$$`)*
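
Since the prefix moved from `cassandra` to `cassandra.consumer`, existing overrides need renaming; a hedged sketch (the application shell and all values are placeholders):

[source,java]
----
// Sketch: the renamed cassandra.consumer.* keys next to the standard
// spring.cassandra.* connection settings.
new SpringApplicationBuilder(CassandraSinkSketchApplication.class) // hypothetical shell
	.web(WebApplicationType.NONE)
	.run("--spring.cloud.function.definition=cassandraConsumer",
			"--cassandra.consumer.ingest-query=insert into book(isbn, title, author) values (?, ?, ?)",
			"--spring.cassandra.keyspace-name=test");
----
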
diff --git a/applications/sink/cassandra-sink/pom.xml b/applications/sink/cassandra-sink/pom.xml
index a85a7047f..df604d710 100644
--- a/applications/sink/cassandra-sink/pom.xml
+++ b/applications/sink/cassandra-sink/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- cassandra-consumer
+ spring-cassandra-consumer
@@ -43,15 +43,13 @@
cassandra
sink
${project.version}
-
- org.springframework.cloud.fn.consumer.cassandra.CassandraConsumerConfiguration.class
-
-
+ AUTOCONFIGURATION
+ cassandraConsumer
org.springframework.cloud.fn
- cassandra-consumer
+ spring-cassandra-consumer
diff --git a/applications/sink/elasticsearch-sink/README.adoc b/applications/sink/elasticsearch-sink/README.adoc
index e7a63e4b0..bfd690964 100644
--- a/applications/sink/elasticsearch-sink/README.adoc
+++ b/applications/sink/elasticsearch-sink/README.adoc
@@ -20,11 +20,11 @@ Properties grouped by prefix:
=== elasticsearch.consumer
-$$async$$:: $$Indicates whether the indexing operation is async or not. By default indexing is done synchronously.$$ *($$Boolean$$, default: `$$false$$`)*
+$$async$$:: $$Indicates whether the indexing operation is async or not. By default, indexing is done synchronously.$$ *($$Boolean$$, default: `$$false$$`)*
$$batch-size$$:: $$Number of items to index for each request. It defaults to 1. For values greater than 1 bulk indexing API will be used.$$ *($$Integer$$, default: `$$1$$`)*
$$group-timeout$$:: $$Timeout in milliseconds after which message group is flushed when bulk indexing is active. It defaults to -1, meaning no automatic flush of idle message groups occurs.$$ *($$Long$$, default: `$$-1$$`)*
-$$id$$:: $$The id of the document to index. If set, the INDEX_ID header value overrides this property on a per message basis.$$ *($$Expression$$, default: `$$$$`)*
-$$index$$:: $$Name of the index. If set, the INDEX_NAME header value overrides this property on a per message basis.$$ *($$String$$, default: `$$$$`)*
+$$id$$:: $$The id of the document to index. If set, the INDEX_ID header value overrides this property on a per-message basis.$$ *($$Expression$$, default: `$$$$`)*
+$$index$$:: $$Name of the index. If set, the INDEX_NAME header value overrides this property on a per-message basis.$$ *($$String$$, default: `$$$$`)*
$$routing$$:: $$Indicates the shard to route to. If not provided, Elasticsearch will default to a hash of the document id.$$ *($$String$$, default: `$$$$`)*
$$timeout-seconds$$:: $$Timeout for the shard to be available. If not set, it defaults to 1 minute set by the Elasticsearch client.$$ *($$Long$$, default: `$$0$$`)*
diff --git a/applications/sink/elasticsearch-sink/pom.xml b/applications/sink/elasticsearch-sink/pom.xml
index 01fa91c34..cf117e5a2 100644
--- a/applications/sink/elasticsearch-sink/pom.xml
+++ b/applications/sink/elasticsearch-sink/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- elasticsearch-consumer
+ spring-elasticsearch-consumer
org.awaitility
@@ -72,14 +72,14 @@
elasticsearch
sink
${project.version}
- org.springframework.cloud.fn.consumer.elasticsearch.ElasticsearchConsumerConfiguration.class
+ AUTOCONFIGURATION
byteArrayTextToString|elasticsearchConsumer
org.springframework.cloud.fn
- elasticsearch-consumer
+ spring-elasticsearch-consumer
diff --git a/applications/sink/file-sink/README.adoc b/applications/sink/file-sink/README.adoc
index 20487057c..889495f58 100644
--- a/applications/sink/file-sink/README.adoc
+++ b/applications/sink/file-sink/README.adoc
@@ -15,10 +15,10 @@ The file sink app writes each message it receives to a file.
The `file-sink` has the following options:
//tag::configuration-properties[]
-$$file.consumer.binary$$:: $$A flag to indicate whether adding a newline after the write should be suppressed.$$ *($$Boolean$$, default: `$$false$$`)*
+$$file.consumer.binary$$:: $$A flag to indicate whether adding a newline after the write operation should be suppressed.$$ *($$Boolean$$, default: `$$false$$`)*
$$file.consumer.charset$$:: $$The charset to use when writing text content.$$ *($$String$$, default: `$$UTF-8$$`)*
$$file.consumer.directory$$:: $$The parent directory of the target file.$$ *($$File$$, default: `$$$$`)*
-$$file.consumer.directory-expression$$:: $$The expression to evaluate for the parent directory of the target file.$$ *($$String$$, default: `$$$$`)*
+$$file.consumer.directory-expression$$:: $$The expression to evaluate for the parent directory of the target file.$$ *($$Expression$$, default: `$$$$`)*
$$file.consumer.mode$$:: $$The FileExistsMode to use if the target file already exists.$$ *($$FileExistsMode$$, default: `$$$$`, possible values: `APPEND`,`APPEND_NO_FLUSH`,`FAIL`,`IGNORE`,`REPLACE`,`REPLACE_IF_MODIFIED`)*
$$file.consumer.name$$:: $$The name of the target file.$$ *($$String$$, default: `$$file-consumer$$`)*
$$file.consumer.name-expression$$:: $$The expression to evaluate for the name of the target file.$$ *($$String$$, default: `$$$$`)*
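
With `directory-expression` now surfaced as a SpEL `Expression`, per-message routing of output files looks like this sketch (the shell class and header names are placeholders):

[source,java]
----
// Sketch: SpEL-driven file placement; both expressions are evaluated
// against each incoming message.
new SpringApplicationBuilder(FileSinkSketchApplication.class) // hypothetical shell
	.web(WebApplicationType.NONE)
	.run("--spring.cloud.function.definition=fileConsumer",
			"--file.consumer.directory-expression='/tmp/out/' + headers['date']",
			"--file.consumer.name-expression='payload-' + headers['id'] + '.txt'",
			"--file.consumer.mode=APPEND");
----
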
diff --git a/applications/sink/file-sink/pom.xml b/applications/sink/file-sink/pom.xml
index 9efc18a3f..269f46f1f 100644
--- a/applications/sink/file-sink/pom.xml
+++ b/applications/sink/file-sink/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- file-consumer
+ spring-file-consumer
@@ -43,13 +43,13 @@
file
sink
${project.version}
- org.springframework.cloud.fn.consumer.file.FileConsumerConfiguration.class
-
+ AUTOCONFIGURATION
+ fileConsumer
org.springframework.cloud.fn
- file-consumer
+ spring-file-consumer
diff --git a/applications/sink/ftp-sink/README.adoc b/applications/sink/ftp-sink/README.adoc
index c6c38c697..3cda28c82 100644
--- a/applications/sink/ftp-sink/README.adoc
+++ b/applications/sink/ftp-sink/README.adoc
@@ -37,14 +37,14 @@ Properties grouped by prefix:
=== ftp.consumer
-$$auto-create-dir$$:: $$Whether or not to create the remote directory.$$ *($$Boolean$$, default: `$$true$$`)*
+$$auto-create-dir$$:: $$Whether to create the remote directory.$$ *($$Boolean$$, default: `$$true$$`)*
$$filename-expression$$:: $$A SpEL expression to generate the remote file name.$$ *($$String$$, default: `$$$$`)*
$$mode$$:: $$Action to take if the remote file already exists.$$ *($$FileExistsMode$$, default: `$$$$`, possible values: `APPEND`,`APPEND_NO_FLUSH`,`FAIL`,`IGNORE`,`REPLACE`,`REPLACE_IF_MODIFIED`)*
$$remote-dir$$:: $$The remote FTP directory.$$ *($$String$$, default: `$$/$$`)*
$$remote-file-separator$$:: $$The remote file separator.$$ *($$String$$, default: `$$/$$`)*
$$temporary-remote-dir$$:: $$A temporary directory where the file will be written if '#isUseTemporaryFilename()' is true.$$ *($$String$$, default: `$$/$$`)*
$$tmp-file-suffix$$:: $$The suffix to use while the transfer is in progress.$$ *($$String$$, default: `$$.tmp$$`)*
-$$use-temporary-filename$$:: $$Whether or not to write to a temporary file and rename.$$ *($$Boolean$$, default: `$$true$$`)*
+$$use-temporary-filename$$:: $$Whether to write to a temporary file and rename.$$ *($$Boolean$$, default: `$$true$$`)*
=== ftp.factory
diff --git a/applications/sink/ftp-sink/pom.xml b/applications/sink/ftp-sink/pom.xml
index f5ad426ed..6f417a0e6 100644
--- a/applications/sink/ftp-sink/pom.xml
+++ b/applications/sink/ftp-sink/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- ftp-consumer
+ spring-ftp-consumer
@@ -43,12 +43,13 @@
ftp
sink
${project.version}
- org.springframework.cloud.fn.consumer.ftp.FtpConsumerConfiguration.class
+ AUTOCONFIGURATION
+ ftpConsumer
org.springframework.cloud.fn
- ftp-consumer
+ spring-ftp-consumer
diff --git a/applications/sink/jdbc-sink/pom.xml b/applications/sink/jdbc-sink/pom.xml
index e388ff359..e8df6b76c 100644
--- a/applications/sink/jdbc-sink/pom.xml
+++ b/applications/sink/jdbc-sink/pom.xml
@@ -18,7 +18,7 @@
org.springframework.cloud.fn
- jdbc-consumer
+ spring-jdbc-consumer
com.h2database
@@ -54,12 +54,13 @@
jdbc
sink
${project.version}
- org.springframework.cloud.fn.consumer.jdbc.JdbcConsumerConfiguration.class
+ AUTOCONFIGURATION
+ jdbcConsumer
org.springframework.cloud.fn
- jdbc-consumer
+ spring-jdbc-consumer
diff --git a/applications/sink/kafka-sink/README.adoc b/applications/sink/kafka-sink/README.adoc
index 52f337c40..90638632a 100644
--- a/applications/sink/kafka-sink/README.adoc
+++ b/applications/sink/kafka-sink/README.adoc
@@ -51,6 +51,7 @@ $$value-serializer$$:: $$Serializer class for values.$$ *($$Class<?>$$, default:
=== spring.kafka.template
$$default-topic$$:: $$Default topic to which messages are sent.$$ *($$String$$, default: `$$$$`)*
+$$observation-enabled$$:: $$Whether to enable observation.$$ *($$Boolean$$, default: `$$false$$`)*
$$transaction-id-prefix$$:: $$Transaction id prefix, override the transaction id prefix in the producer factory.$$ *($$String$$, default: `$$$$`)*
//end::configuration-properties[]
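
The new `observation-enabled` switch hooks the underlying `KafkaTemplate` into Micrometer observations; a hedged property sketch (the shell class and topic are placeholders):

[source,java]
----
// Sketch: enabling KafkaTemplate observation for the sink alongside the
// kafka.publisher.* settings exercised by the tests below.
new SpringApplicationBuilder(KafkaSinkSketchApplication.class) // hypothetical shell
	.web(WebApplicationType.NONE)
	.run("--kafka.publisher.topic=my-topic",
			"--spring.kafka.template.observation-enabled=true");
----
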
diff --git a/applications/sink/kafka-sink/pom.xml b/applications/sink/kafka-sink/pom.xml
index 99588869b..5bcd294fe 100644
--- a/applications/sink/kafka-sink/pom.xml
+++ b/applications/sink/kafka-sink/pom.xml
@@ -19,7 +19,7 @@
org.springframework.cloud.fn
- kafka-publisher
+ spring-kafka-publisher
diff --git a/applications/sink/kafka-sink/src/test/java/org/springframework/cloud/stream/app/sink/kafka/KafkaSinkTests.java b/applications/sink/kafka-sink/src/test/java/org/springframework/cloud/stream/app/sink/kafka/KafkaSinkTests.java
index 2416e8ebd..21d516706 100644
--- a/applications/sink/kafka-sink/src/test/java/org/springframework/cloud/stream/app/sink/kafka/KafkaSinkTests.java
+++ b/applications/sink/kafka-sink/src/test/java/org/springframework/cloud/stream/app/sink/kafka/KafkaSinkTests.java
@@ -1,5 +1,5 @@
/*
- * Copyright 2023-2023 the original author or authors.
+ * Copyright 2023-2024 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -46,7 +46,7 @@
"kafka.publisher.topic=" + KafkaSinkTests.TEST_TOPIC,
"kafka.publisher.mappedHeaders=mapped"
})
-@EmbeddedKafka(bootstrapServersProperty = "spring.kafka.bootstrap-servers")
+@EmbeddedKafka(kraft = false)
@DirtiesContext
public class KafkaSinkTests {
diff --git a/applications/sink/log-sink/README.adoc b/applications/sink/log-sink/README.adoc
index 44b7671e6..75d217a86 100644
--- a/applications/sink/log-sink/README.adoc
+++ b/applications/sink/log-sink/README.adoc
@@ -13,9 +13,9 @@ The **$$log$$** $$sink$$ has the following options:
//tag::configuration-properties[]
-$$log.expression$$:: $$A SpEL expression (against the incoming message) to evaluate as the logged message.$$ *($$String$$, default: `$$payload$$`)*
-$$log.level$$:: $$The level at which to log messages.$$ *($$Level$$, default: `$$$$`, possible values: `FATAL`,`ERROR`,`WARN`,`INFO`,`DEBUG`,`TRACE`)*
-$$log.name$$:: $$The name of the logger to use.$$ *($$String$$, default: `$$$$`)*
+$$log.consumer.expression$$:: $$A SpEL expression (against the incoming message) to evaluate as the logged message.$$ *($$String$$, default: `$$payload$$`)*
+$$log.consumer.level$$:: $$The level at which to log messages.$$ *($$Level$$, default: `$$$$`, possible values: `FATAL`,`ERROR`,`WARN`,`INFO`,`DEBUG`,`TRACE`)*
+$$log.consumer.name$$:: $$The name of the logger to use.$$ *($$String$$, default: `$$$$`)*
//end::configuration-properties[]
//end::ref-doc[]
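
All three options moved under the `log.consumer` prefix; a hedged sketch of the renamed keys in use (the shell class is a placeholder, the composed definition matches the pom below):

[source,java]
----
// Sketch: the renamed log.consumer.* keys from the table above.
new SpringApplicationBuilder(LogSinkSketchApplication.class) // hypothetical shell
	.web(WebApplicationType.NONE)
	.run("--spring.cloud.function.definition=byteArrayTextToString|logConsumer",
			"--log.consumer.name=my-logger",
			"--log.consumer.level=WARN",
			"--log.consumer.expression=payload.toUpperCase()");
----
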
diff --git a/applications/sink/log-sink/pom.xml b/applications/sink/log-sink/pom.xml
index 2f7907f4c..44e004d1b 100644
--- a/applications/sink/log-sink/pom.xml
+++ b/applications/sink/log-sink/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- log-consumer
+ spring-log-consumer
org.awaitility
@@ -54,14 +54,14 @@
log
sink
${project.version}
- org.springframework.cloud.fn.consumer.log.LogConsumerConfiguration.class
+ AUTOCONFIGURATION
byteArrayTextToString|logConsumer
org.springframework.cloud.fn
- log-consumer
+ spring-log-consumer
diff --git a/applications/sink/mongodb-sink/pom.xml b/applications/sink/mongodb-sink/pom.xml
index d0fc45974..34f418395 100644
--- a/applications/sink/mongodb-sink/pom.xml
+++ b/applications/sink/mongodb-sink/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- mongodb-consumer
+ spring-mongodb-consumer
io.projectreactor
@@ -48,14 +48,13 @@
mongodb
sink
${project.version}
- org.springframework.cloud.fn.consumer.mongo.MongoDbConsumerConfiguration.class
-
+ AUTOCONFIGURATION
byteArrayTextToString|mongodbConsumer
org.springframework.cloud.fn
- mongodb-consumer
+ spring-mongodb-consumer
org.springframework.cloud.fn
diff --git a/applications/sink/mqtt-sink/README.adoc b/applications/sink/mqtt-sink/README.adoc
index fc8fd079e..ebdaa5a2c 100644
--- a/applications/sink/mqtt-sink/README.adoc
+++ b/applications/sink/mqtt-sink/README.adoc
@@ -30,12 +30,12 @@ $$username$$:: $$the username to use when connecting to the broker.$$ *($$String
=== mqtt.consumer
-$$async$$:: $$whether or not to use async sends.$$ *($$Boolean$$, default: `$$false$$`)*
-$$charset$$:: $$the charset used to convert a String payload to byte[].$$ *($$String$$, default: `$$UTF-8$$`)*
-$$client-id$$:: $$identifies the client.$$ *($$String$$, default: `$$stream.client.id.sink$$`)*
-$$qos$$:: $$the quality of service to use.$$ *($$Integer$$, default: `$$1$$`)*
-$$retained$$:: $$whether to set the 'retained' flag.$$ *($$Boolean$$, default: `$$false$$`)*
-$$topic$$:: $$the topic to which the sink will publish.$$ *($$String$$, default: `$$stream.mqtt$$`)*
+$$async$$:: $$Whether to use async sends.$$ *($$Boolean$$, default: `$$false$$`)*
+$$charset$$:: $$The charset used to convert a String payload to byte[].$$ *($$String$$, default: `$$UTF-8$$`)*
+$$client-id$$:: $$Identifies the client.$$ *($$String$$, default: `$$stream.client.id.sink$$`)*
+$$qos$$:: $$The quality of service to use.$$ *($$Integer$$, default: `$$1$$`)*
+$$retained$$:: $$Whether to set the 'retained' flag.$$ *($$Boolean$$, default: `$$false$$`)*
+$$topic$$:: $$The topic to which the sink will publish.$$ *($$String$$, default: `$$stream.mqtt$$`)*
//end::configuration-properties[]
//end::ref-doc[]
diff --git a/applications/sink/mqtt-sink/pom.xml b/applications/sink/mqtt-sink/pom.xml
index 4a751ded4..bc3f46c63 100644
--- a/applications/sink/mqtt-sink/pom.xml
+++ b/applications/sink/mqtt-sink/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- mqtt-consumer
+ spring-mqtt-consumer
org.testcontainers
@@ -49,13 +49,13 @@
mqtt
sink
${project.version}
- org.springframework.cloud.fn.consumer.mqtt.MqttConsumerConfiguration.class
-
+ AUTOCONFIGURATION
+ mqttConsumer
org.springframework.cloud.fn
- mqtt-consumer
+ spring-mqtt-consumer
diff --git a/applications/sink/pgcopy-sink/pom.xml b/applications/sink/pgcopy-sink/pom.xml
index 037e5dc40..b2aafc3de 100644
--- a/applications/sink/pgcopy-sink/pom.xml
+++ b/applications/sink/pgcopy-sink/pom.xml
@@ -28,11 +28,6 @@
postgresql
compile
-
- org.springframework.cloud
- spring-cloud-stream-test-binder
- test
-
diff --git a/applications/sink/rabbit-sink/README.adoc b/applications/sink/rabbit-sink/README.adoc
index 70a8cc00e..93b26eae7 100644
--- a/applications/sink/rabbit-sink/README.adoc
+++ b/applications/sink/rabbit-sink/README.adoc
@@ -13,14 +13,13 @@ The **$$rabbit$$** $$sink$$ has the following options:
Properties grouped by prefix:
-=== rabbit
+=== rabbit.consumer
$$converter-bean-name$$:: $$The bean name for a custom message converter; if omitted, a SimpleMessageConverter is used. If 'jsonConverter', a Jackson2JsonMessageConverter bean will be created for you.$$ *($$String$$, default: `$$$$`)*
$$exchange$$:: $$Exchange name - overridden by exchangeNameExpression, if supplied.$$ *($$String$$, default: `$$$$`)*
$$exchange-expression$$:: $$A SpEL expression that evaluates to an exchange name.$$ *($$Expression$$, default: `$$$$`)*
-$$headers-mapped-last$$:: $$When mapping headers for the outbound message, determine whether the headers are mapped before the message is converted, or afterwards.$$ *($$Boolean$$, default: `$$true$$`)*
+$$headers-mapped-last$$:: $$When mapping headers for the outbound message, determine whether the headers are mapped before the message is converted, or afterward.$$ *($$Boolean$$, default: `$$true$$`)*
$$mapped-request-headers$$:: $$Headers that will be mapped.$$ *($$String[]$$, default: `$$[*]$$`)*
-$$own-connection$$:: $$When true, use a separate connection based on the boot properties.$$ *($$Boolean$$, default: `$$false$$`)*
$$persistent-delivery-mode$$:: $$Default delivery mode when 'amqp_deliveryMode' header is not present, true for PERSISTENT.$$ *($$Boolean$$, default: `$$false$$`)*
$$routing-key$$:: $$Routing key - overridden by routingKeyExpression, if supplied.$$ *($$String$$, default: `$$$$`)*
$$routing-key-expression$$:: $$A SpEL expression that evaluates to a routing key.$$ *($$Expression$$, default: `$$$$`)*
@@ -32,6 +31,7 @@ $$addresses$$:: $$Comma-separated list of addresses to which the client should c
$$channel-rpc-timeout$$:: $$Continuation timeout for RPC calls in channels. Set it to zero to wait forever.$$ *($$Duration$$, default: `$$10m$$`)*
$$connection-timeout$$:: $$Connection timeout. Set it to zero to wait forever.$$ *($$Duration$$, default: `$$$$`)*
$$host$$:: $$RabbitMQ host. Ignored if an address is set.$$ *($$String$$, default: `$$localhost$$`)*
+$$max-inbound-message-body-size$$:: $$Maximum size of the body of inbound (received) messages.$$ *($$DataSize$$, default: `$$64MB$$`)*
$$password$$:: $$Login to authenticate against the broker.$$ *($$String$$, default: `$$guest$$`)*
$$port$$:: $$RabbitMQ port. Ignored if an address is set. Default to 5672, or 5671 if SSL is enabled.$$ *($$Integer$$, default: `$$$$`)*
$$publisher-confirm-type$$:: $$Type of publisher confirms to use.$$ *($$ConfirmType$$, default: `$$$$`, possible values: `SIMPLE`,`CORRELATED`,`NONE`)*
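
As with the other sinks in this patch, the options moved under a `.consumer` prefix (`rabbit` to `rabbit.consumer`); a hedged sketch of the renamed keys (the shell class and values are placeholders):

[source,java]
----
// Sketch: the renamed rabbit.consumer.* keys from the table above.
new SpringApplicationBuilder(RabbitSinkSketchApplication.class) // hypothetical shell
	.web(WebApplicationType.NONE)
	.run("--spring.cloud.function.definition=rabbitConsumer",
			"--rabbit.consumer.exchange=my-exchange",
			"--rabbit.consumer.routing-key=my-key",
			"--rabbit.consumer.persistent-delivery-mode=true");
----
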
diff --git a/applications/sink/rabbit-sink/pom.xml b/applications/sink/rabbit-sink/pom.xml
index 36aa64f6d..50e248ef1 100644
--- a/applications/sink/rabbit-sink/pom.xml
+++ b/applications/sink/rabbit-sink/pom.xml
@@ -27,14 +27,9 @@
org.springframework.cloud.fn
- rabbit-consumer
+ spring-rabbit-consumer
- org.springframework.cloud
- spring-cloud-stream-test-binder
- test
-
-
org.testcontainers
testcontainers
${testcontainers.version}
@@ -76,13 +71,13 @@
rabbit
sink
${project.version}
- org.springframework.cloud.fn.consumer.rabbit.RabbitConsumerConfiguration.class
-
+ AUTOCONFIGURATION
+ rabbitConsumer
org.springframework.cloud.fn
- rabbit-consumer
+ spring-rabbit-consumer
diff --git a/applications/sink/redis-sink/README.adoc b/applications/sink/redis-sink/README.adoc
index ea07a136e..7ecf5ce2f 100644
--- a/applications/sink/redis-sink/README.adoc
+++ b/applications/sink/redis-sink/README.adoc
@@ -14,11 +14,11 @@ Properties grouped by prefix:
=== redis.consumer
$$key$$:: $$A literal key name to use when storing to a key.$$ *($$String$$, default: `$$$$`)*
-$$key-expression$$:: $$A SpEL expression to use for storing to a key.$$ *($$String$$, default: `$$$$`)*
+$$key-expression$$:: $$A SpEL expression to use for storing to a key.$$ *($$Expression$$, default: `$$$$`)*
$$queue$$:: $$A literal queue name to use when storing in a queue.$$ *($$String$$, default: `$$$$`)*
-$$queue-expression$$:: $$A SpEL expression to use for queue.$$ *($$String$$, default: `$$$$`)*
+$$queue-expression$$:: $$A SpEL expression to use for queue.$$ *($$Expression$$, default: `$$$$`)*
$$topic$$:: $$A literal topic name to use when publishing to a topic.$$ *($$String$$, default: `$$$$`)*
-$$topic-expression$$:: $$A SpEL expression to use for topic.$$ *($$String$$, default: `$$$$`)*
+$$topic-expression$$:: $$A SpEL expression to use for topic.$$ *($$Expression$$, default: `$$$$`)*
=== spring.data.redis
diff --git a/applications/sink/redis-sink/pom.xml b/applications/sink/redis-sink/pom.xml
index eb9585b7a..1bff70bb5 100644
--- a/applications/sink/redis-sink/pom.xml
+++ b/applications/sink/redis-sink/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- redis-consumer
+ spring-redis-consumer
@@ -79,13 +79,13 @@
redis
sink
${project.version}
- org.springframework.cloud.fn.consumer.redis.RedisConsumerConfiguration.class
-
+ AUTOCONFIGURATION
+ redisConsumer
org.springframework.cloud.fn
- redis-consumer
+ spring-redis-consumer
diff --git a/applications/sink/redis-sink/src/test/java/org/springframework/cloud/stream/app/sink/redis/RedisSinkTests.java b/applications/sink/redis-sink/src/test/java/org/springframework/cloud/stream/app/sink/redis/RedisSinkTests.java
index f89e7c3d8..7c05255cf 100644
--- a/applications/sink/redis-sink/src/test/java/org/springframework/cloud/stream/app/sink/redis/RedisSinkTests.java
+++ b/applications/sink/redis-sink/src/test/java/org/springframework/cloud/stream/app/sink/redis/RedisSinkTests.java
@@ -1,5 +1,5 @@
/*
- * Copyright 2020-2022 the original author or authors.
+ * Copyright 2020-2024 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,12 +24,10 @@
import org.springframework.boot.WebApplicationType;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;
-import org.springframework.cloud.fn.consumer.redis.RedisConsumerConfiguration;
import org.springframework.cloud.fn.consumer.redis.RedisTestContainerSupport;
import org.springframework.cloud.stream.binder.test.InputDestination;
import org.springframework.cloud.stream.binder.test.TestChannelBinderConfiguration;
import org.springframework.context.ConfigurableApplicationContext;
-import org.springframework.context.annotation.Import;
import org.springframework.data.redis.core.StringRedisTemplate;
import org.springframework.data.redis.support.collections.DefaultRedisList;
import org.springframework.data.redis.support.collections.RedisList;
@@ -47,32 +45,30 @@ public class RedisSinkTests implements RedisTestContainerSupport {
@Test
public void testRedisSink() {
+ String key = "foo";
try (ConfigurableApplicationContext context = new SpringApplicationBuilder(
TestChannelBinderConfiguration
.getCompleteConfiguration(RedisSinkTestApplication.class))
.web(WebApplicationType.NONE)
.run("--spring.cloud.function.definition=redisConsumer",
+ "--spring.cloud.stream.bindings.redisConsumer-in-0.consumer.use-native-decoding=true",
"--spring.data.redis.url=" + RedisTestContainerSupport.getUri(),
- "--redis.consumer.key=foo")) {
+ "--redis.consumer.key=" + key)) {
- //Setup
- String key = "foo";
-
- final StringRedisTemplate redisTemplate = context.getBean(StringRedisTemplate.class);
+ StringRedisTemplate redisTemplate = context.getBean(StringRedisTemplate.class);
redisTemplate.delete(key);
- RedisList<String> redisList = new DefaultRedisList<>(key, redisTemplate);
List<String> list = new ArrayList<>();
list.add("Manny");
list.add("Moe");
list.add("Jack");
- //Execute
Message<List<String>> message = new GenericMessage<>(list);
InputDestination source = context.getBean(InputDestination.class);
source.send(message);
+ RedisList<String> redisList = new DefaultRedisList<>(key, redisTemplate);
assertThat(redisList.size()).isEqualTo(3);
assertThat(redisList.get(0)).isEqualTo("Manny");
assertThat(redisList.get(1)).isEqualTo("Moe");
@@ -81,7 +77,6 @@ public void testRedisSink() {
}
@SpringBootApplication
- @Import(RedisConsumerConfiguration.class)
public static class RedisSinkTestApplication {
}
}
diff --git a/applications/sink/router-sink/pom.xml b/applications/sink/router-sink/pom.xml
index 0c1ece0e2..bd92aeb66 100644
--- a/applications/sink/router-sink/pom.xml
+++ b/applications/sink/router-sink/pom.xml
@@ -54,7 +54,7 @@
org.springframework.cloud.fn
- payload-converter-function
+ spring-payload-converter-function
org.springframework.boot
diff --git a/applications/sink/router-sink/src/main/java/org/springframework/cloud/stream/app/sink/router/RouterSinkConfiguration.java b/applications/sink/router-sink/src/main/java/org/springframework/cloud/stream/app/sink/router/RouterSinkConfiguration.java
index 836dde298..6ecf7423f 100644
--- a/applications/sink/router-sink/src/main/java/org/springframework/cloud/stream/app/sink/router/RouterSinkConfiguration.java
+++ b/applications/sink/router-sink/src/main/java/org/springframework/cloud/stream/app/sink/router/RouterSinkConfiguration.java
@@ -1,5 +1,5 @@
/*
- * Copyright 2016-2023 the original author or authors.
+ * Copyright 2016-2024 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -36,6 +36,7 @@
import org.springframework.integration.router.AbstractMessageRouter;
import org.springframework.integration.router.ExpressionEvaluatingRouter;
import org.springframework.integration.router.MethodInvokingRouter;
+import org.springframework.integration.scripting.dsl.ScriptSpec;
import org.springframework.integration.scripting.dsl.Scripts;
import org.springframework.lang.Nullable;
import org.springframework.messaging.Message;
@@ -93,12 +94,11 @@ public AbstractMessageRouter router(BindingService bindingService, StreamBridge
@Bean
@ConditionalOnProperty("router.script")
- public MessageProcessor<?> scriptProcessor() {
+ public ScriptSpec scriptProcessor() {
return Scripts.processor(this.properties.getScript())
.lang("groovy")
.refreshCheckDelay(this.properties.getRefreshDelay())
- .variables(obtainScriptVariables(this.properties))
- .get();
+ .variables(obtainScriptVariables(this.properties));
}
private static Map<String, Object> obtainScriptVariables(RouterSinkProperties properties) {
diff --git a/applications/sink/rsocket-sink/pom.xml b/applications/sink/rsocket-sink/pom.xml
index 7aaebde95..47a772029 100644
--- a/applications/sink/rsocket-sink/pom.xml
+++ b/applications/sink/rsocket-sink/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- rsocket-consumer
+ spring-rsocket-consumer
io.projectreactor
@@ -53,12 +53,13 @@
rsocket
sink
${project.version}
- org.springframework.cloud.fn.consumer.rsocket.RsocketConsumerConfiguration.class
+ AUTOCONFIGURATION
+ rsocketConsumer
org.springframework.cloud.fn
- rsocket-consumer
+ spring-rsocket-consumer
diff --git a/applications/sink/rsocket-sink/src/test/java/org/springframework/cloud/stream/app/rsocket/sink/RSocketSinkTests.java b/applications/sink/rsocket-sink/src/test/java/org/springframework/cloud/stream/app/rsocket/sink/RSocketSinkTests.java
index 185166e91..8524eaa67 100644
--- a/applications/sink/rsocket-sink/src/test/java/org/springframework/cloud/stream/app/rsocket/sink/RSocketSinkTests.java
+++ b/applications/sink/rsocket-sink/src/test/java/org/springframework/cloud/stream/app/rsocket/sink/RSocketSinkTests.java
@@ -1,5 +1,5 @@
/*
- * Copyright 2020-2020 the original author or authors.
+ * Copyright 2020-2024 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,13 +16,11 @@
package org.springframework.cloud.stream.app.rsocket.sink;
-import java.util.function.Function;
+import java.time.Duration;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;
-import reactor.core.publisher.Flux;
-import reactor.core.publisher.Mono;
-import reactor.core.publisher.ReplayProcessor;
+import reactor.core.publisher.Sinks;
import reactor.test.StepVerifier;
import org.springframework.beans.factory.annotation.Autowired;
@@ -36,11 +34,9 @@
import org.springframework.cloud.stream.binder.test.InputDestination;
import org.springframework.cloud.stream.binder.test.TestChannelBinderConfiguration;
import org.springframework.context.ApplicationContext;
-import org.springframework.context.annotation.Import;
import org.springframework.integration.support.MessageBuilder;
import org.springframework.messaging.Message;
import org.springframework.messaging.handler.annotation.MessageMapping;
-import org.springframework.messaging.support.GenericMessage;
import org.springframework.stereotype.Controller;
import org.springframework.test.annotation.DirtiesContext;
import org.springframework.test.util.ReflectionTestUtils;
@@ -48,7 +44,7 @@
/**
* @author Soby Chacko
*/
-@SpringBootTest(properties = {"spring.rsocket.server.port=0"}, classes = RSocketSinkTests.RSocketserverApplication.class)
+@SpringBootTest(properties = {"spring.rsocket.server.port=0"}, classes = RSocketSinkTests.RSocketServerApplication.class)
@DirtiesContext
public class RSocketSinkTests {
@@ -71,15 +67,11 @@ void testRsocketConsumer() {
final int port = server.address().getPort();
applicationContextRunner.withPropertyValues(
- "spring.cloud.function.definition=rsocketConsumer",
- "rsocket.consumer.port=" + port,
- "rsocket.consumer.route=test-route")
+ "spring.cloud.function.definition=rsocketFunctionConsumer",
+ "rsocket.consumer.port=" + port,
+ "rsocket.consumer.route=test-route")
.run(context -> {
- Function<Flux<Message<?>>, Mono<Void>> rsocketConsumer = context.getBean("rsocketConsumer", Function.class);
- rsocketConsumer.apply(Flux.just(new GenericMessage<>("Hello RSocket")))
- .subscribe();
-
- final StepVerifier stepVerifier = StepVerifier.create(RSocketserverApplication.fireForgetPayloads)
+ final StepVerifier stepVerifier = StepVerifier.create(RSocketServerApplication.fireForgetPayloads.asMono())
.expectNext("Hello RSocket")
.thenCancel()
.verifyLater();
@@ -88,25 +80,24 @@ void testRsocketConsumer() {
InputDestination source = context.getBean(InputDestination.class);
source.send(message);
- stepVerifier.verify();
+ stepVerifier.verify(Duration.ofSeconds(10));
});
}
- @EnableAutoConfiguration
+ @EnableAutoConfiguration(exclude = RsocketConsumerConfiguration.class)
@SpringBootConfiguration
@Controller
- static class RSocketserverApplication {
- static final ReplayProcessor<String> fireForgetPayloads = ReplayProcessor.create();
+ static class RSocketServerApplication {
+ static final Sinks.One<String> fireForgetPayloads = Sinks.one();
@MessageMapping("test-route")
void someMethod(String payload) {
- this.fireForgetPayloads.onNext(payload);
+ this.fireForgetPayloads.tryEmitValue(payload);
}
}
@EnableAutoConfiguration
@SpringBootConfiguration
- @Import(RsocketConsumerConfiguration.class)
static class RsocketSinkTestApplication {
}
diff --git a/applications/sink/s3-sink/README.adoc b/applications/sink/s3-sink/README.adoc
index dbf1bc2d7..6ec4d5037 100644
--- a/applications/sink/s3-sink/README.adoc
+++ b/applications/sink/s3-sink/README.adoc
@@ -44,6 +44,7 @@ $$static$$:: $$$$ *($$String$$, default: `$$$$`)*
$$accelerate-mode-enabled$$:: $$Option to enable using the accelerate endpoint when accessing S3. Accelerate endpoints allow faster transfer of objects by using Amazon CloudFront's globally distributed edge locations.$$ *($$Boolean$$, default: `$$$$`)*
$$checksum-validation-enabled$$:: $$Option to disable doing a validation of the checksum of an object stored in S3.$$ *($$Boolean$$, default: `$$$$`)*
$$chunked-encoding-enabled$$:: $$Option to enable using chunked encoding when signing the request payload for {@link software.amazon.awssdk.services.s3.model.PutObjectRequest} and {@link software.amazon.awssdk.services.s3.model.UploadPartRequest}.$$ *($$Boolean$$, default: `$$$$`)*
+$$cross-region-enabled$$:: $$Enables cross-region bucket access.$$ *($$Boolean$$, default: `$$$$`)*
$$endpoint$$:: $$Overrides the default endpoint.$$ *($$URI$$, default: `$$$$`)*
$$path-style-access-enabled$$:: $$Option to enable using path style access for accessing S3 objects instead of DNS style access. DNS style access is preferred as it will result in better load balancing when accessing S3.$$ *($$Boolean$$, default: `$$$$`)*
$$region$$:: $$Overrides the default region.$$ *($$String$$, default: `$$$$`)*
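
The new `cross-region-enabled` flag complements the endpoint and region overrides above; a hedged sketch (the shell class and region value are placeholders):

[source,java]
----
// Sketch: cross-region bucket access for the S3 sink.
new SpringApplicationBuilder(S3SinkSketchApplication.class) // hypothetical shell
	.web(WebApplicationType.NONE)
	.run("--spring.cloud.function.definition=s3Consumer",
			"--s3.consumer.bucket=my-bucket",
			"--spring.cloud.aws.s3.cross-region-enabled=true",
			"--spring.cloud.aws.region.static=eu-west-1");
----
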
diff --git a/applications/sink/s3-sink/pom.xml b/applications/sink/s3-sink/pom.xml
index c015855bc..4d75ce7bf 100644
--- a/applications/sink/s3-sink/pom.xml
+++ b/applications/sink/s3-sink/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- s3-consumer
+ spring-s3-consumer
org.springframework.integration
@@ -58,13 +58,13 @@
s3
sink
${project.version}
- org.springframework.cloud.fn.consumer.s3.AwsS3ConsumerConfiguration.class
-
+ AUTOCONFIGURATION
+ s3Consumer
org.springframework.cloud.fn
- s3-consumer
+ spring-s3-consumer
diff --git a/applications/sink/s3-sink/src/test/java/org/springframework/cloud/stream/app/s3/sink/AwsS3SinkTests.java b/applications/sink/s3-sink/src/test/java/org/springframework/cloud/stream/app/s3/sink/AwsS3SinkTests.java
index f47c29540..24be6bb9e 100644
--- a/applications/sink/s3-sink/src/test/java/org/springframework/cloud/stream/app/s3/sink/AwsS3SinkTests.java
+++ b/applications/sink/s3-sink/src/test/java/org/springframework/cloud/stream/app/s3/sink/AwsS3SinkTests.java
@@ -1,5 +1,5 @@
/*
- * Copyright 2016-2023 the original author or authors.
+ * Copyright 2016-2024 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,6 +17,7 @@
package org.springframework.cloud.stream.app.s3.sink;
import java.io.File;
+import java.nio.ByteBuffer;
import java.nio.file.Path;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CountDownLatch;
@@ -40,7 +41,7 @@
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.test.context.SpringBootTest;
-import org.springframework.cloud.fn.consumer.s3.AwsS3ConsumerConfiguration;
+import org.springframework.boot.test.mock.mockito.MockBean;
import org.springframework.cloud.stream.binder.test.InputDestination;
import org.springframework.cloud.stream.binder.test.TestChannelBinderConfiguration;
import org.springframework.context.annotation.Bean;
@@ -54,7 +55,6 @@
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.BDDMockito.willReturn;
import static org.mockito.Mockito.atLeastOnce;
-import static org.mockito.Mockito.spy;
import static org.mockito.Mockito.verify;
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.NONE,
@@ -62,6 +62,7 @@
"spring.cloud.aws.credentials.accessKey=" + AwsS3SinkTests.AWS_ACCESS_KEY,
"spring.cloud.aws.credentials.secretKey=" + AwsS3SinkTests.AWS_SECRET_KEY,
"spring.cloud.aws.region.static=" + AwsS3SinkTests.AWS_REGION,
+ "spring.cloud.aws.s3.endpoint=s3://test.endpoint",
"s3.consumer.bucket=" + AwsS3SinkTests.S3_BUCKET,
"s3.consumer.acl=PUBLIC_READ_WRITE"})
@DirtiesContext
@@ -78,7 +79,7 @@ public class AwsS3SinkTests {
@TempDir
protected static Path temporaryRemoteFolder;
- @Autowired
+ @MockBean
private S3AsyncClient amazonS3;
@Autowired
@@ -95,8 +96,6 @@ public class AwsS3SinkTests {
@BeforeEach
public void setupTest() {
- S3AsyncClient amazonS3 = spy(this.amazonS3);
-
willReturn(CompletableFuture.completedFuture(PutObjectResponse.builder().build()))
.given(amazonS3)
.putObject(any(PutObjectRequest.class), any(AsyncRequestBody.class));
@@ -133,7 +132,7 @@ public void testS3SinkWithBinderBasic() throws Exception {
AsyncRequestBody asyncRequestBody = asyncRequestBodyArgumentCaptor.getValue();
StepVerifier.create(asyncRequestBody)
- .assertNext(buffer -> assertThat(buffer.array()).isEmpty())
+ .assertNext(buffer -> assertThat(buffer).isEqualTo(ByteBuffer.allocate(0)))
.expectComplete()
.verify();
@@ -141,7 +140,7 @@ public void testS3SinkWithBinderBasic() throws Exception {
}
@SpringBootApplication
- @Import({AwsS3ConsumerConfiguration.class, TestChannelBinderConfiguration.class})
+ @Import(TestChannelBinderConfiguration.class)
public static class SampleConfiguration {
@Bean
diff --git a/applications/sink/sftp-sink/pom.xml b/applications/sink/sftp-sink/pom.xml
index 71ec07dff..9f05bf73b 100644
--- a/applications/sink/sftp-sink/pom.xml
+++ b/applications/sink/sftp-sink/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- sftp-consumer
+ spring-sftp-consumer
@@ -43,13 +43,13 @@
sftp
sink
${project.version}
- org.springframework.cloud.fn.consumer.sftp.SftpConsumerConfiguration.class
-
+ AUTOCONFIGURATION
+ sftpConsumer
org.springframework.cloud.fn
- sftp-consumer
+ spring-sftp-consumer
diff --git a/applications/sink/tcp-sink/README.adoc b/applications/sink/tcp-sink/README.adoc
index 5f940b246..d0f1c51d3 100644
--- a/applications/sink/tcp-sink/README.adoc
+++ b/applications/sink/tcp-sink/README.adoc
@@ -23,11 +23,11 @@ $$host$$:: $$The host to which this sink will connect.$$ *($$String$$, default:
=== tcp
-$$nio$$:: $$Whether or not to use NIO.$$ *($$Boolean$$, default: `$$false$$`)*
+$$nio$$:: $$Whether to use NIO.$$ *($$Boolean$$, default: `$$false$$`)*
$$port$$:: $$The port on which to listen; 0 for the OS to choose a port.$$ *($$Integer$$, default: `$$1234$$`)*
$$reverse-lookup$$:: $$Perform a reverse DNS lookup on the remote IP Address; if false, just the IP address is included in the message headers.$$ *($$Boolean$$, default: `$$false$$`)*
$$socket-timeout$$:: $$The timeout (ms) before closing the socket when no data is received.$$ *($$Integer$$, default: `$$120000$$`)*
-$$use-direct-buffers$$:: $$Whether or not to use direct buffers.$$ *($$Boolean$$, default: `$$false$$`)*
+$$use-direct-buffers$$:: $$Whether to use direct buffers.$$ *($$Boolean$$, default: `$$false$$`)*
//end::configuration-properties[]
== Available Encoders
diff --git a/applications/sink/tcp-sink/pom.xml b/applications/sink/tcp-sink/pom.xml
index 93955b11d..1a1551a59 100644
--- a/applications/sink/tcp-sink/pom.xml
+++ b/applications/sink/tcp-sink/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- tcp-consumer
+ spring-tcp-consumer
@@ -43,12 +43,13 @@
tcp
sink
${project.version}
- org.springframework.cloud.fn.consumer.tcp.TcpConsumerConfiguration.class
+ AUTOCONFIGURATION
+ tcpConsumer
org.springframework.cloud.fn
- tcp-consumer
+ spring-tcp-consumer
diff --git a/applications/sink/throughput-sink/README.adoc b/applications/sink/throughput-sink/README.adoc
index b7589fa8b..1a077a493 100644
--- a/applications/sink/throughput-sink/README.adoc
+++ b/applications/sink/throughput-sink/README.adoc
@@ -8,7 +8,7 @@ Sink that will count messages and log the observed throughput at a selected inte
The **$$throughput$$** $$sink$$ has the following options:
//tag::configuration-properties[]
-$$throughput.report-every-ms$$:: $$how often to report.$$ *($$Integer$$, default: `$$$$`)*
+$$throughput.report-every-ms$$:: $$how often to report.$$ *($$Integer$$, default: `$$1000$$`)*
//end::configuration-properties[]
//end::ref-doc[]
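
The default reporting interval is now documented as 1000 ms; overriding it is a single property, sketched here (the shell class is a placeholder):

[source,java]
----
// Sketch: report throughput every 5 seconds instead of the 1000 ms default.
new SpringApplicationBuilder(ThroughputSinkSketchApplication.class) // hypothetical shell
	.web(WebApplicationType.NONE)
	.run("--throughput.report-every-ms=5000");
----
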
diff --git a/applications/sink/twitter-message-sink/pom.xml b/applications/sink/twitter-message-sink/pom.xml
index fbf1f4cf0..1d456076b 100644
--- a/applications/sink/twitter-message-sink/pom.xml
+++ b/applications/sink/twitter-message-sink/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- twitter-consumer
+ spring-twitter-consumer
org.awaitility
@@ -33,13 +33,6 @@
org.mock-server
mockserver-netty
- ${mockserver.version}
- test
-
-
- org.mock-server
- mockserver-client-java
- ${mockserver.version}
test
@@ -66,14 +59,14 @@
twitter-message
sink
${project.version}
- org.springframework.cloud.fn.consumer.twitter.message.TwitterMessageConsumerConfiguration.class
- byteArrayTextToString|sendDirectMessageConsumer
+ AUTOCONFIGURATION
+ byteArrayTextToString|twitterSendMessageConsumer
org.springframework.cloud.fn
- twitter-consumer
+ spring-twitter-consumer
diff --git a/applications/sink/twitter-message-sink/src/test/java/org/springframework/cloud/stream/app/sink/twitter/message/TwitterMessageSinkIntegrationTests.java b/applications/sink/twitter-message-sink/src/test/java/org/springframework/cloud/stream/app/sink/twitter/message/TwitterMessageSinkIntegrationTests.java
index 736febcd2..e09e924a0 100644
--- a/applications/sink/twitter-message-sink/src/test/java/org/springframework/cloud/stream/app/sink/twitter/message/TwitterMessageSinkIntegrationTests.java
+++ b/applications/sink/twitter-message-sink/src/test/java/org/springframework/cloud/stream/app/sink/twitter/message/TwitterMessageSinkIntegrationTests.java
@@ -1,5 +1,5 @@
/*
- * Copyright 2020-2020 the original author or authors.
+ * Copyright 2020-2024 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -36,15 +36,12 @@
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.cloud.fn.common.twitter.TwitterConnectionProperties;
import org.springframework.cloud.fn.common.twitter.util.TwitterTestUtils;
-import org.springframework.cloud.fn.consumer.twitter.message.TwitterMessageConsumerConfiguration;
import org.springframework.cloud.stream.binder.test.InputDestination;
import org.springframework.cloud.stream.binder.test.TestChannelBinderConfiguration;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.annotation.Bean;
-import org.springframework.context.annotation.Import;
import org.springframework.context.annotation.Primary;
import org.springframework.messaging.support.GenericMessage;
-import org.springframework.test.util.TestSocketUtils;
import static org.assertj.core.api.Assertions.assertThat;
import static org.mockserver.matchers.Times.unlimited;
@@ -54,21 +51,18 @@
/**
* @author Christian Tzolov
+ * @author Artem Bilan
*/
public class TwitterMessageSinkIntegrationTests {
- private static final String MOCK_SERVER_IP = "127.0.0.1";
-
- private static final Integer MOCK_SERVER_PORT = TestSocketUtils.findAvailableTcpPort();
-
private static ClientAndServer mockServer;
private static MockServerClient mockClient;
@BeforeEach
public void startMockServer() {
- mockServer = ClientAndServer.startClientAndServer(MOCK_SERVER_PORT);
- mockClient = new MockServerClient(MOCK_SERVER_IP, MOCK_SERVER_PORT);
+ mockServer = ClientAndServer.startClientAndServer();
+ mockClient = new MockServerClient("localhost", mockServer.getPort());
mockClient
.when(
@@ -111,7 +105,7 @@ public void directMessageScreenName() {
try (ConfigurableApplicationContext context = new SpringApplicationBuilder(
TestChannelBinderConfiguration.getCompleteConfiguration(TestTwitterMessageSinkApplication.class))
.web(WebApplicationType.NONE)
- .run("--spring.cloud.function.definition=byteArrayTextToString|sendDirectMessageConsumer",
+ .run("--spring.cloud.function.definition=byteArrayTextToString|twitterSendMessageConsumer",
"--twitter.message.update.screenName='user666'",
@@ -149,7 +143,7 @@ public void directMessageUserId() {
try (ConfigurableApplicationContext context = new SpringApplicationBuilder(
TestChannelBinderConfiguration.getCompleteConfiguration(TestTwitterMessageSinkApplication.class))
.web(WebApplicationType.NONE)
- .run("--spring.cloud.function.definition=byteArrayTextToString|sendDirectMessageConsumer",
+ .run("--spring.cloud.function.definition=byteArrayTextToString|twitterSendMessageConsumer",
"--twitter.message.update.userId='1075751718749659136'",
@@ -178,7 +172,7 @@ public void directMessageDefaults() {
try (ConfigurableApplicationContext context = new SpringApplicationBuilder(
TestChannelBinderConfiguration.getCompleteConfiguration(TestTwitterMessageSinkApplication.class))
.web(WebApplicationType.NONE)
- .run("--spring.cloud.function.definition=byteArrayTextToString|sendDirectMessageConsumer",
+ .run("--spring.cloud.function.definition=byteArrayTextToString|twitterSendMessageConsumer",
"--twitter.message.update.userId=headers['user']",
"--twitter.message.update.text=payload.concat(\" with suffix \")",
@@ -192,8 +186,8 @@ public void directMessageDefaults() {
InputDestination source = context.getBean(InputDestination.class);
assertThat(source).isNotNull();
- Map headers = Collections.singletonMap("user", "1075751718749659136");
- source.send(new GenericMessage("hello".getBytes(StandardCharsets.UTF_8), headers));
+ Map<String, Object> headers = Collections.singletonMap("user", "1075751718749659136");
+ source.send(new GenericMessage<>("hello".getBytes(StandardCharsets.UTF_8), headers));
mockClient.verify(request()
.withMethod("POST")
@@ -208,7 +202,6 @@ public void directMessageDefaults() {
@SpringBootConfiguration
@EnableAutoConfiguration
- @Import(TwitterMessageConsumerConfiguration.class)
public static class TestTwitterMessageSinkApplication {
@Bean
@@ -218,10 +211,11 @@ public twitter4j.conf.Configuration twitterConfiguration2(TwitterConnectionPrope
Function<TwitterConnectionProperties, ConfigurationBuilder> mockedConfiguration =
toConfigurationBuilder.andThen(
- new TwitterTestUtils().mockTwitterUrls(
- String.format("http://%s:%s", MOCK_SERVER_IP, MOCK_SERVER_PORT)));
+ new TwitterTestUtils().mockTwitterUrls("http://localhost:" + mockServer.getPort()));
return mockedConfiguration.apply(properties).build();
}
+
}
+
}
diff --git a/applications/sink/twitter-update-sink/README.adoc b/applications/sink/twitter-update-sink/README.adoc
index 076390d22..22a4cef01 100644
--- a/applications/sink/twitter-update-sink/README.adoc
+++ b/applications/sink/twitter-update-sink/README.adoc
@@ -21,9 +21,9 @@ Properties grouped by prefix:
=== twitter.update
-$$attachment-url$$:: $$(SpEL expression) In order for a URL to not be counted in the text body of an extended Tweet, provide a URL as a Tweet attachment. This URL must be a Tweet permalink, or Direct Message deep link. Arbitrary, non-Twitter URLs must remain in the text text. URLs passed to the attachment_url parameter not matching either a Tweet permalink or Direct Message deep link will fail at Tweet creation and cause an exception.$$ *($$Expression$$, default: `$$$$`)*
-$$display-coordinates$$:: $$(SpEL expression) Whether or not to put a pin on the exact coordinates a Tweet has been sent from.$$ *($$Expression$$, default: `$$$$`)*
-$$in-reply-to-status-id$$:: $$(SpEL expression) The ID of an existing text that the update is in reply to. Note: This parameter will be ignored unless the author of the Tweet this parameter references is mentioned within the text text. Therefore, you must include @username, where username is the author of the referenced Tweet, within the update. When inReplyToStatusId is set the auto_populate_reply_metadata is automatically set as well. Later ensures that leading @mentions will be looked up from the original Tweet, and added to the new Tweet from there. This wil append @mentions into the metadata of an extended Tweet as a reply chain grows, until the limit on @mentions is reached. In cases where the original Tweet has been deleted, the reply will fail.$$ *($$Expression$$, default: `$$$$`)*
+$$attachment-url$$:: $$(SpEL expression) In order for a URL to not be counted in the text body of an extended Tweet, provide a URL as a Tweet attachment. This URL must be a Tweet permalink, or Direct Message deep link. Arbitrary, non-Twitter URLs must remain in the text. URLs passed to the attachment_url parameter not matching either a Tweet permalink or Direct Message deep link will fail at Tweet creation and cause an exception.$$ *($$Expression$$, default: `$$$$`)*
+$$display-coordinates$$:: $$(SpEL expression) Whether to put a pin on the exact coordinates a Tweet has been sent from.$$ *($$Expression$$, default: `$$$$`)*
+$$in-reply-to-status-id$$:: $$(SpEL expression) The ID of an existing text that the update is in reply to. Note: This parameter will be ignored unless the author of the Tweet this parameter references is mentioned within the text. Therefore, you must include @username, where username is the author of the referenced Tweet, within the update. When inReplyToStatusId is set the auto_populate_reply_metadata is automatically set as well. The latter ensures that leading @mentions will be looked up from the original Tweet, and added to the new Tweet from there. This will append @mentions into the metadata of an extended Tweet as a reply chain grows, until the limit on @mentions is reached. In cases where the original Tweet has been deleted, the reply will fail.$$ *($$Expression$$, default: `$$$$`)*
$$media-ids$$:: $$(SpEL expression) A comma-delimited list of media_ids to associate with the Tweet. You may include up to 4 photos or 1 animated GIF or 1 video in a Tweet. See Uploading Media for further details on uploading media.$$ *($$Expression$$, default: `$$$$`)*
$$place-id$$:: $$(SpEL expression) A place in the world.$$ *($$Expression$$, default: `$$$$`)*
$$text$$:: $$(SpEL expression) The text of the text update. URL encode as necessary. t.co link wrapping will affect character counts. Defaults to message's payload$$ *($$Expression$$, default: `$$payload$$`)*
@@ -31,7 +31,7 @@ $$text$$:: $$(SpEL expression) The text of the text update. URL encode as necess
=== twitter.update.location
$$lat$$:: $$The latitude of the location this Tweet refers to. This parameter will be ignored unless it is inside the range -90.0 to +90.0 (North is positive) inclusive. It will also be ignored if there is no corresponding long parameter.$$ *($$Expression$$, default: `$$$$`)*
-$$lon$$:: $$The longitude of the location this Tweet refers to. The valid ranges for longitude are -180.0 to +180.0 (East is positive) inclusive. This parameter will be ignored if outside that range, if it is not a number, if geo_enabled is disabled, or if there no corresponding lat parameter.$$ *($$Expression$$, default: `$$$$`)*
+$$lon$$:: $$The longitude of the location this Tweet refers to. The valid ranges for longitude are -180.0 to +180.0 (East is positive) inclusive. This parameter will be ignored if outside that range, if it is not a number, if geo_enabled is disabled, or if there is no corresponding lat parameter.$$ *($$Expression$$, default: `$$$$`)*
//end::configuration-properties[]
//end::ref-doc[]
diff --git a/applications/sink/twitter-update-sink/pom.xml b/applications/sink/twitter-update-sink/pom.xml
index b21721fb8..e150c7b40 100644
--- a/applications/sink/twitter-update-sink/pom.xml
+++ b/applications/sink/twitter-update-sink/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- twitter-consumer
+ spring-twitter-consumer
org.awaitility
@@ -33,13 +33,6 @@
org.mock-server
mockserver-netty
- ${mockserver.version}
- test
-
-
- org.mock-server
- mockserver-client-java
- ${mockserver.version}
test
@@ -66,14 +59,14 @@
twitter-update
sink
${project.version}
- org.springframework.cloud.fn.consumer.twitter.status.update.TwitterUpdateConsumerConfiguration.class
+ AUTOCONFIGURATION
byteArrayTextToString|twitterStatusUpdateConsumer
org.springframework.cloud.fn
- twitter-consumer
+ spring-twitter-consumer
diff --git a/applications/sink/twitter-update-sink/src/test/java/org/springframework/cloud/stream/app/sink/twitter/update/TwitterUpdateSinkIntegrationTests.java b/applications/sink/twitter-update-sink/src/test/java/org/springframework/cloud/stream/app/sink/twitter/update/TwitterUpdateSinkIntegrationTests.java
index 9ed84dd7c..9b5e27a1a 100644
--- a/applications/sink/twitter-update-sink/src/test/java/org/springframework/cloud/stream/app/sink/twitter/update/TwitterUpdateSinkIntegrationTests.java
+++ b/applications/sink/twitter-update-sink/src/test/java/org/springframework/cloud/stream/app/sink/twitter/update/TwitterUpdateSinkIntegrationTests.java
@@ -17,7 +17,6 @@
package org.springframework.cloud.stream.app.sink.twitter.update;
import java.nio.charset.StandardCharsets;
-import java.util.concurrent.TimeUnit;
import java.util.function.Function;
import org.junit.jupiter.api.AfterAll;
@@ -34,16 +33,13 @@
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.cloud.fn.common.twitter.TwitterConnectionProperties;
import org.springframework.cloud.fn.common.twitter.util.TwitterTestUtils;
-import org.springframework.cloud.fn.consumer.twitter.status.update.TwitterUpdateConsumerConfiguration;
import org.springframework.cloud.fn.consumer.twitter.status.update.TwitterUpdateConsumerProperties;
import org.springframework.cloud.stream.binder.test.InputDestination;
import org.springframework.cloud.stream.binder.test.TestChannelBinderConfiguration;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.annotation.Bean;
-import org.springframework.context.annotation.Import;
import org.springframework.context.annotation.Primary;
import org.springframework.messaging.support.GenericMessage;
-import org.springframework.test.util.TestSocketUtils;
import static org.assertj.core.api.Assertions.assertThat;
import static org.mockserver.matchers.Times.unlimited;
@@ -53,21 +49,19 @@
/**
* @author Christian Tzolov
+ * @author Artem Bilan
*/
public class TwitterUpdateSinkIntegrationTests {
- private static final String MOCK_SERVER_IP = "127.0.0.1";
-
- private static final Integer MOCK_SERVER_PORT = TestSocketUtils.findAvailableTcpPort();
-
private static ClientAndServer mockServer;
private static MockServerClient mockClient;
@BeforeAll
public static void startMockServer() {
- mockServer = ClientAndServer.startClientAndServer(MOCK_SERVER_PORT);
- mockClient = new MockServerClient(MOCK_SERVER_IP, MOCK_SERVER_PORT);
+ mockServer = ClientAndServer.startClientAndServer();
+ mockClient = new MockServerClient("localhost", mockServer.getPort());
+
mockClient
.when(
request().withMethod("POST").withPath("/statuses/update.json"),
@@ -75,8 +69,7 @@ public static void startMockServer() {
.respond(response()
.withStatusCode(200)
.withHeader("Content-Type", "application/json; charset=utf-8")
- .withBody(TwitterTestUtils.asString("classpath:/response/update_test_1.json"))
- .withDelay(TimeUnit.SECONDS, 1));
+ .withBody(TwitterTestUtils.asString("classpath:/response/update_test_1.json")));
}
@AfterAll
@@ -211,7 +204,6 @@ public void updateWithAllParams() {
@SpringBootConfiguration
@EnableAutoConfiguration
- @Import(TwitterUpdateConsumerConfiguration.class)
public static class TestTwitterUpdateSinkApplication {
@Bean
@@ -221,10 +213,11 @@ public twitter4j.conf.Configuration twitterConfiguration2(TwitterConnectionPrope
Function<TwitterConnectionProperties, ConfigurationBuilder> mockedConfiguration =
toConfigurationBuilder.andThen(
- new TwitterTestUtils().mockTwitterUrls(
- String.format("http://%s:%s", MOCK_SERVER_IP, MOCK_SERVER_PORT)));
+ new TwitterTestUtils().mockTwitterUrls("http://localhost:" + mockServer.getPort()));
return mockedConfiguration.apply(properties).build();
}
+
}
+
}
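The test now lets MockServer pick its own port instead of reserving one up front with `TestSocketUtils`, which removes the window between finding a free port and binding it. The pattern in isolation, using only calls that appear in the diff:

[source,java]
----
import org.mockserver.client.MockServerClient;
import org.mockserver.integration.ClientAndServer;

public final class DynamicPortMockServer {

	public static void main(String[] args) {
		// Starting with no argument lets MockServer choose a free port itself.
		ClientAndServer mockServer = ClientAndServer.startClientAndServer();
		// The client then targets whatever port was actually bound.
		MockServerClient mockClient = new MockServerClient("localhost", mockServer.getPort());
		System.out.println("MockServer listening on " + mockServer.getPort());
		mockServer.stop();
	}

}
----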
diff --git a/applications/sink/wavefront-sink/pom.xml b/applications/sink/wavefront-sink/pom.xml
index b862c686f..77686418b 100644
--- a/applications/sink/wavefront-sink/pom.xml
+++ b/applications/sink/wavefront-sink/pom.xml
@@ -17,11 +17,11 @@
org.springframework.cloud.fn
- wavefront-consumer
+ spring-wavefront-consumer
org.springframework.cloud.fn
- function-test-support
+ spring-function-test-support
test
@@ -48,16 +48,14 @@
wavefront
sink
${project.version}
-
- org.springframework.cloud.fn.consumer.wavefront.WavefrontConsumerConfiguration.class
-
+ AUTOCONFIGURATION
wavefrontConsumer
org.springframework.cloud.fn
- wavefront-consumer
+ spring-wavefront-consumer
diff --git a/applications/sink/websocket-sink/README.adoc b/applications/sink/websocket-sink/README.adoc
index 9a99b02ee..7032ae0c2 100644
--- a/applications/sink/websocket-sink/README.adoc
+++ b/applications/sink/websocket-sink/README.adoc
@@ -7,11 +7,11 @@ A simple Websocket Sink implementation.
The following options are supported:
//tag::configuration-properties[]
-$$websocket.consumer.log-level$$:: $$the logLevel for netty channels. Default is WARN$$ *($$String$$, default: `$$$$`)*
-$$websocket.consumer.path$$:: $$the path on which a WebsocketSink consumer needs to connect. Default is /websocket$$ *($$String$$, default: `$$/websocket$$`)*
-$$websocket.consumer.port$$:: $$the port on which the Netty server listens. Default is 9292$$ *($$Integer$$, default: `$$9292$$`)*
-$$websocket.consumer.ssl$$:: $$whether or not to create a {@link io.netty.handler.ssl.SslContext}.$$ *($$Boolean$$, default: `$$false$$`)*
-$$websocket.consumer.threads$$:: $$the number of threads for the Netty {@link io.netty.channel.EventLoopGroup}. Default is 1$$ *($$Integer$$, default: `$$1$$`)*
+$$websocket.consumer.log-level$$:: $$The logLevel for netty channels. Default is WARN$$ *($$String$$, default: `$$$$`)*
+$$websocket.consumer.path$$:: $$The path on which a WebsocketSink consumer needs to connect. Default is /websocket$$ *($$String$$, default: `$$/websocket$$`)*
+$$websocket.consumer.port$$:: $$The port on which the Netty server listens. Default is 9292$$ *($$Integer$$, default: `$$9292$$`)*
+$$websocket.consumer.ssl$$:: $$Whether to create a {@link io.netty.handler.ssl.SslContext}.$$ *($$Boolean$$, default: `$$false$$`)*
+$$websocket.consumer.threads$$:: $$The number of threads for the Netty {@link io.netty.channel.EventLoopGroup}. Default is 1$$ *($$Integer$$, default: `$$1$$`)*
//end::configuration-properties[]
== Examples
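For a quick smoke test against the defaults documented above (port 9292, path `/websocket`), a plain JDK 11+ client is enough. This sketch assumes a websocket sink already running on localhost and prints whatever the sink broadcasts:

[source,java]
----
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.WebSocket;
import java.util.concurrent.CompletionStage;

public class WebsocketSinkSmokeTest {

	public static void main(String[] args) throws InterruptedException {
		HttpClient.newHttpClient().newWebSocketBuilder()
			.buildAsync(URI.create("ws://localhost:9292/websocket"), new WebSocket.Listener() {

				@Override
				public CompletionStage<?> onText(WebSocket webSocket, CharSequence data, boolean last) {
					System.out.println("Received: " + data);
					return WebSocket.Listener.super.onText(webSocket, data, last);
				}

			})
			.join();
		// Keep the JVM alive long enough to observe broadcasts.
		Thread.sleep(60_000);
	}

}
----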
diff --git a/applications/sink/websocket-sink/pom.xml b/applications/sink/websocket-sink/pom.xml
index 68b4f59f9..bfaf12984 100644
--- a/applications/sink/websocket-sink/pom.xml
+++ b/applications/sink/websocket-sink/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- websocket-consumer
+ spring-websocket-consumer
org.springframework.boot
@@ -26,7 +26,7 @@
org.springframework.cloud.fn
- function-test-support
+ spring-function-test-support
test
@@ -53,13 +53,13 @@
websocket
sink
${project.version}
- org.springframework.cloud.fn.consumer.websocket.WebsocketConsumerConfiguration.class
-
+ AUTOCONFIGURATION
+ websocketConsumer
org.springframework.cloud.fn
- websocket-consumer
+ spring-websocket-consumer
diff --git a/applications/sink/xmpp-sink/README.adoc b/applications/sink/xmpp-sink/README.adoc
index 92d0bae36..82de01d05 100644
--- a/applications/sink/xmpp-sink/README.adoc
+++ b/applications/sink/xmpp-sink/README.adoc
@@ -29,8 +29,8 @@ $$chat-to$$:: $$XMPP handle to send message to.$$ *($$String$$, default: `$$$$`)*
$$password$$:: $$The Password for the connected user.$$ *($$String$$, default: `$$$$`)*
-$$port$$:: $$Port for connecting to the host. - Default Client Port: 5222$$ *($$Integer$$, default: `$$5222$$`)*
-$$resource$$:: $$The Resource to bind to on the XMPP Host. - Can be empty, server will generate one if not set$$ *($$String$$, default: `$$$$`)*
+$$port$$:: $$Port for connecting to the host. Default client port: 5222.$$ *($$Integer$$, default: `$$5222$$`)*
+$$resource$$:: $$The Resource to bind to on the XMPP Host. Can be empty; the server will generate one if not set.$$ *($$String$$, default: `$$$$`)*
$$security-mode$$:: $$$$ *($$SecurityMode$$, default: `$$$$`, possible values: `required`,`ifpossible`,`disabled`)*
$$service-name$$:: $$The Service Name to set for the XMPP Domain.$$ *($$String$$, default: `$$$$`)*
$$subscription-mode$$:: $$$$ *($$SubscriptionMode$$, default: `$$$$`, possible values: `accept_all`,`reject_all`,`manual`)*
diff --git a/applications/sink/xmpp-sink/pom.xml b/applications/sink/xmpp-sink/pom.xml
index 647a1a5a9..a1069495e 100644
--- a/applications/sink/xmpp-sink/pom.xml
+++ b/applications/sink/xmpp-sink/pom.xml
@@ -20,7 +20,7 @@
org.springframework.cloud.fn
- xmpp-consumer
+ spring-xmpp-consumer
@@ -37,7 +37,7 @@
org.springframework.cloud.fn
- function-test-support
+ spring-function-test-support
${project.version}
test
@@ -66,14 +66,13 @@
xmpp
sink
${project.version}
- org.springframework.cloud.fn.consumer.xmpp.XmppConsumerConfiguration.class
-
-
+ AUTOCONFIGURATION
+ xmppConsumer
org.springframework.cloud.fn
- xmpp-consumer
+ spring-xmpp-consumer
org.springframework.cloud.stream.app
diff --git a/applications/sink/zeromq-sink/pom.xml b/applications/sink/zeromq-sink/pom.xml
index 5ba060ce9..52a49665d 100644
--- a/applications/sink/zeromq-sink/pom.xml
+++ b/applications/sink/zeromq-sink/pom.xml
@@ -20,7 +20,7 @@
org.springframework.cloud.fn
- zeromq-consumer
+ spring-zeromq-consumer
@@ -58,14 +58,13 @@
zeromq
sink
${project.version}
- org.springframework.cloud.fn.consumer.zeromq.ZeroMqConsumerConfiguration.class
-
-
+ AUTOCONFIGURATION
+ zeromqConsumer
org.springframework.cloud.fn
- zeromq-consumer
+ spring-zeromq-consumer
org.springframework.cloud.stream.app
diff --git a/applications/source/debezium-source/README.adoc b/applications/source/debezium-source/README.adoc
index 263450512..4cb6fb3d8 100644
--- a/applications/source/debezium-source/README.adoc
+++ b/applications/source/debezium-source/README.adoc
@@ -24,14 +24,14 @@ Properties grouped by prefix:
=== debezium
$$debezium-native-configuration$$:: $$$$ *($$Properties$$, default: `$$$$`)*
-$$header-format$$:: $${@link ChangeEvent} header format. Defaults to 'JSON'.$$ *($$DebeziumFormat$$, default: `$$$$`, possible values: `JSON`,`AVRO`,`PROTOBUF`)*
$$offset-commit-policy$$:: $$The policy that defines when the offsets should be committed to offset storage.$$ *($$DebeziumOffsetCommitPolicy$$, default: `$$$$`, possible values: `ALWAYS`,`PERIODIC`,`DEFAULT`)*
-$$payload-format$$:: $${@link ChangeEvent} Key and Payload formats. Defaults to 'JSON'.$$ *($$DebeziumFormat$$, default: `$$$$`, possible values: `JSON`,`AVRO`,`PROTOBUF`)*
+$$payload-format$$:: $${@code io.debezium.engine.ChangeEvent} Key and Payload formats. Defaults to 'JSON'.$$ *($$DebeziumFormat$$, default: `$$$$`, possible values: `JSON`,`AVRO`,`PROTOBUF`)*
$$properties$$:: $$Spring pass-through wrapper for Debezium configuration properties. All properties with a 'debezium.properties.*' prefix are native Debezium properties.$$ *($$Map$$, default: `$$$$`)*
=== debezium.supplier
-$$copy-headers$$:: $$Copy Change Event headers into Message headers.$$ *($$Boolean$$, default: `$$true$$`)*
+$$enable-empty-payload$$:: $$Enable support for tombstone (aka delete) messages.$$ *($$Boolean$$, default: `$$true$$`)*
+$$header-names-to-map$$:: $$Patterns for {@code ChangeEvent.headers()} to map.$$ *($$String[]$$, default: `$$[*]$$`)*
//end::configuration-properties[]
==== Event flattening configuration
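The new `enable-empty-payload` option replaces the earlier `KafkaNull`-based handling seen in the test diffs further below: a delete (tombstone) event is now delivered as a message whose payload is `Optional.empty()`, regardless of whether Kafka is on the classpath. A hedged sketch of consuming such events downstream (the header name is taken from the tests):

[source,java]
----
import java.util.Optional;

import org.springframework.messaging.Message;

public final class TombstoneAwareHandler {

	public void handle(Message<?> message) {
		if (Optional.empty().equals(message.getPayload())) {
			// Tombstone: the row was deleted; the origin is still in the headers.
			Object destination = message.getHeaders().get("debezium_destination");
			System.out.println("Delete event from " + destination);
		}
		else {
			// Regular change event with its serialized payload.
			System.out.println("Change event: " + message.getPayload());
		}
	}

}
----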
diff --git a/applications/source/debezium-source/pom.xml b/applications/source/debezium-source/pom.xml
index f3eb9e9f5..aeef35f6b 100644
--- a/applications/source/debezium-source/pom.xml
+++ b/applications/source/debezium-source/pom.xml
@@ -22,15 +22,13 @@
org.springframework.cloud.fn
- debezium-supplier
- ${java-functions.version}
+ spring-debezium-supplier
org.springframework.cloud.fn
- function-test-support
- ${java-functions.version}
+ spring-function-test-support
test
@@ -159,7 +157,7 @@
org.springframework.cloud.fn
- debezium-supplier
+ spring-debezium-supplier
org.springframework.cloud.stream.app
diff --git a/applications/source/debezium-source/src/test/java/org/springframework/cloud/stream/app/source/debezium/integration/DebeziumDeleteHandlingIntegrationTest.java b/applications/source/debezium-source/src/test/java/org/springframework/cloud/stream/app/source/debezium/integration/DebeziumDeleteHandlingIntegrationTest.java
index 6b331df84..83499adf5 100644
--- a/applications/source/debezium-source/src/test/java/org/springframework/cloud/stream/app/source/debezium/integration/DebeziumDeleteHandlingIntegrationTest.java
+++ b/applications/source/debezium-source/src/test/java/org/springframework/cloud/stream/app/source/debezium/integration/DebeziumDeleteHandlingIntegrationTest.java
@@ -18,6 +18,7 @@
import java.time.Duration;
import java.util.List;
+import java.util.Optional;
import org.junit.jupiter.api.Tag;
import org.junit.jupiter.params.ParameterizedTest;
@@ -26,19 +27,15 @@
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
-import org.springframework.boot.test.context.FilteredClassLoader;
import org.springframework.boot.test.context.runner.ApplicationContextRunner;
import org.springframework.boot.test.context.runner.ContextConsumer;
import org.springframework.cloud.fn.common.debezium.DebeziumProperties;
-import org.springframework.cloud.fn.supplier.debezium.DebeziumReactiveConsumerConfiguration;
import org.springframework.cloud.stream.binder.test.OutputDestination;
import org.springframework.cloud.stream.binder.test.TestChannelBinderConfiguration;
import org.springframework.context.ApplicationContext;
import org.springframework.jdbc.core.JdbcTemplate;
-import org.springframework.kafka.support.KafkaNull;
import org.springframework.messaging.Message;
import org.springframework.test.jdbc.JdbcTestUtils;
-import org.springframework.util.ClassUtils;
import static org.assertj.core.api.Assertions.assertThat;
@@ -106,10 +103,6 @@ public class DebeziumDeleteHandlingIntegrationTest {
"debezium.properties.transforms.unwrap.delete.handling.mode=rewrite,debezium.properties.transforms.unwrap.drop.tombstones=false"
})
public void handleRecordDeletions(String properties) {
- contextRunner.withPropertyValues(properties.split(","))
- .withClassLoader(new FilteredClassLoader(KafkaNull.class)) // Remove Kafka from the
- .run(consumer);
-
contextRunner.withPropertyValues(properties.split(","))
.run(consumer);
}
@@ -123,10 +116,6 @@ private String toString(Object object) {
JdbcTemplate jdbcTemplate = context.getBean(JdbcTemplate.class);
DebeziumProperties props = context.getBean(DebeziumProperties.class);
- boolean isKafkaPresent = ClassUtils.isPresent(
- DebeziumReactiveConsumerConfiguration.ORG_SPRINGFRAMEWORK_KAFKA_SUPPORT_KAFKA_NULL,
- context.getClassLoader());
-
String deleteHandlingMode = props.getProperties().get("transforms.unwrap.delete.handling.mode");
String isDropTombstones = props.getProperties().get("transforms.unwrap.drop.tombstones");
@@ -153,15 +142,14 @@ else if (deleteHandlingMode.equals("none")) {
else if (deleteHandlingMode.equals("rewrite")) {
received = outputDestination.receive(Duration.ofSeconds(10).toMillis(), DebeziumTestUtils.BINDING_NAME);
assertThat(received).isNotNull();
- assertThat(toString(received.getPayload()).contains("\"__deleted\":\"true\""));
+ assertThat(toString(received.getPayload())).contains("\"__deleted\":\"true\"");
}
- if (!(isDropTombstones.equals("true")) && isKafkaPresent) {
+ if (!(isDropTombstones.equals("true"))) {
received = outputDestination.receive(Duration.ofSeconds(10).toMillis(), DebeziumTestUtils.BINDING_NAME);
assertThat(received).isNotNull();
// Tombstones event should have KafkaNull payload
- assertThat(received.getPayload().getClass().getCanonicalName())
- .isEqualTo(DebeziumReactiveConsumerConfiguration.ORG_SPRINGFRAMEWORK_KAFKA_SUPPORT_KAFKA_NULL);
+ assertThat(received.getPayload()).isEqualTo(Optional.empty());
Object keyRaw = received.getHeaders().get("debezium_key");
String key = (keyRaw instanceof byte[]) ? new String((byte[]) keyRaw) : "" + keyRaw;
diff --git a/applications/source/debezium-source/src/test/java/org/springframework/cloud/stream/app/source/debezium/integration/DebeziumFlatteningIntegrationTest.java b/applications/source/debezium-source/src/test/java/org/springframework/cloud/stream/app/source/debezium/integration/DebeziumFlatteningIntegrationTest.java
index 9f95965af..9e392d66b 100644
--- a/applications/source/debezium-source/src/test/java/org/springframework/cloud/stream/app/source/debezium/integration/DebeziumFlatteningIntegrationTest.java
+++ b/applications/source/debezium-source/src/test/java/org/springframework/cloud/stream/app/source/debezium/integration/DebeziumFlatteningIntegrationTest.java
@@ -18,6 +18,7 @@
import java.time.Duration;
import java.util.List;
+import java.util.Optional;
import net.javacrumbs.jsonunit.JsonAssert;
import net.javacrumbs.jsonunit.core.Configuration;
@@ -27,19 +28,15 @@
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
-import org.springframework.boot.test.context.FilteredClassLoader;
import org.springframework.boot.test.context.runner.ApplicationContextRunner;
import org.springframework.boot.test.context.runner.ContextConsumer;
import org.springframework.cloud.fn.common.debezium.DebeziumProperties;
-import org.springframework.cloud.fn.supplier.debezium.DebeziumReactiveConsumerConfiguration;
import org.springframework.cloud.stream.binder.test.OutputDestination;
import org.springframework.cloud.stream.binder.test.TestChannelBinderConfiguration;
import org.springframework.context.ApplicationContext;
import org.springframework.jdbc.core.JdbcTemplate;
-import org.springframework.kafka.support.KafkaNull;
import org.springframework.messaging.Message;
import org.springframework.test.jdbc.JdbcTestUtils;
-import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import static org.assertj.core.api.Assertions.assertThat;
@@ -96,14 +93,7 @@ public class DebeziumFlatteningIntegrationTest {
"app.datasource.type=com.zaxxer.hikari.HikariDataSource");
@Test
- public void noFlattenedResponseNoKafka() {
- contextRunner
- .withClassLoader(new FilteredClassLoader(KafkaNull.class)) // Remove Kafka from the classpath
- .run(noFlatteningTest);
- }
-
- @Test
- public void noFlattenedResponseWithKafka() {
+ public void noFlattenedResponse() {
contextRunner.run(noFlatteningTest);
}
@@ -111,15 +101,11 @@ public void noFlattenedResponseWithKafka() {
OutputDestination outputDestination = context.getBean(OutputDestination.class);
JdbcTemplate jdbcTemplate = context.getBean(JdbcTemplate.class);
- boolean isKafkaPresent = ClassUtils.isPresent(
- DebeziumReactiveConsumerConfiguration.ORG_SPRINGFRAMEWORK_KAFKA_SUPPORT_KAFKA_NULL,
- context.getClassLoader());
-
List<Message<?>> messages = DebeziumTestUtils.receiveAll(outputDestination);
assertThat(messages).hasSizeGreaterThanOrEqualTo(52);
JsonAssert.assertJsonEquals(DebeziumTestUtils.resourceToString(
- "classpath:/json/mysql_ddl_drop_inventory_address_table.json"),
+ "classpath:/json/mysql_ddl_drop_inventory_address_table.json"),
toString(messages.get(1).getPayload()),
Configuration.empty().whenIgnoringPaths("schemaName", "tableChanges", "source.sequence",
"source.ts_ms", "ts_ms"));
@@ -143,7 +129,7 @@ public void noFlattenedResponseWithKafka() {
messages = DebeziumTestUtils.receiveAll(outputDestination);
- assertThat(messages).hasSize(isKafkaPresent ? 4 : 3);
+ assertThat(messages).hasSize(4);
JsonAssert.assertJsonEquals(
DebeziumTestUtils.resourceToString("classpath:/json/mysql_update_inventory_customers.json"),
@@ -160,30 +146,13 @@ public void noFlattenedResponseWithKafka() {
JsonAssert.assertJsonEquals("{\"id\":" + newRecordId + "}",
toString(messages.get(1).getHeaders().get("debezium_key")));
- if (isKafkaPresent) {
- assertThat(messages.get(3).getPayload().getClass().getCanonicalName())
- .isEqualTo(DebeziumReactiveConsumerConfiguration.ORG_SPRINGFRAMEWORK_KAFKA_SUPPORT_KAFKA_NULL,
- "Tombstones event should have KafkaNull payload");
- assertThat(messages.get(3).getHeaders().get("debezium_destination"))
- .isEqualTo("my-topic.inventory.customers");
- JsonAssert.assertJsonEquals("{\"id\":" + newRecordId + "}",
- toString(messages.get(3).getHeaders().get("debezium_key")));
- }
+ assertThat(messages.get(3).getPayload()).isEqualTo(Optional.empty());
+ assertThat(messages.get(3).getHeaders().get("debezium_destination"))
+ .isEqualTo("my-topic.inventory.customers");
+ JsonAssert.assertJsonEquals("{\"id\":" + newRecordId + "}",
+ toString(messages.get(3).getHeaders().get("debezium_key")));
};
- @Test
- public void flattenedResponseNoKafka() {
- contextRunner
- .withPropertyValues("debezium.properties.transforms=unwrap",
- "debezium.properties.transforms.unwrap.type=io.debezium.transforms.ExtractNewRecordState",
- "debezium.properties.transforms.unwrap.add.fields=name,db,op",
- "debezium.properties.transforms.unwrap.add.headers=name,op",
- "debezium.properties.transforms.unwrap.delete.handling.mode=none",
- "debezium.properties.transforms.unwrap.drop.tombstones=false")
- .withClassLoader(new FilteredClassLoader(KafkaNull.class)) // Remove Kafka from the classpath
- .run(flatteningTest);
- }
-
@Test
public void flattenedResponseWithKafka() {
contextRunner
@@ -212,10 +181,6 @@ public void flattenedResponseWithKafkaDropTombstone() {
OutputDestination outputDestination = context.getBean(OutputDestination.class);
JdbcTemplate jdbcTemplate = context.getBean(JdbcTemplate.class);
- boolean isKafkaPresent = ClassUtils.isPresent(
- DebeziumReactiveConsumerConfiguration.ORG_SPRINGFRAMEWORK_KAFKA_SUPPORT_KAFKA_NULL,
- context.getClassLoader());
-
List<Message<?>> messages = DebeziumTestUtils.receiveAll(outputDestination);
assertThat(messages).hasSizeGreaterThanOrEqualTo(52);
@@ -225,7 +190,7 @@ public void flattenedResponseWithKafkaDropTombstone() {
String isDropTombstones = props.getProperties().get("transforms.unwrap.drop.tombstones");
JsonAssert.assertJsonEquals(DebeziumTestUtils.resourceToString(
- "classpath:/json/mysql_ddl_drop_inventory_address_table.json"),
+ "classpath:/json/mysql_ddl_drop_inventory_address_table.json"),
toString(messages.get(1).getPayload()),
Configuration.empty().whenIgnoringPaths("schemaName", "tableChanges", "source.sequence",
"source.ts_ms", "ts_ms"));
@@ -256,7 +221,7 @@ public void flattenedResponseWithKafkaDropTombstone() {
messages = DebeziumTestUtils.receiveAll(outputDestination);
- assertThat(messages).hasSize((isDropTombstones.equals("false") && isKafkaPresent) ? 4 : 3);
+ assertThat(messages).hasSize((isDropTombstones.equals("false")) ? 4 : 3);
JsonAssert.assertJsonEquals(
DebeziumTestUtils.resourceToString("classpath:/json/mysql_flattened_update_inventory_customers.json"),
@@ -273,10 +238,8 @@ public void flattenedResponseWithKafkaDropTombstone() {
toString(messages.get(1).getHeaders().get("debezium_key")));
}
- if (isDropTombstones.equals("false") && isKafkaPresent) {
- assertThat(messages.get(3).getPayload().getClass().getCanonicalName())
- .isEqualTo(DebeziumReactiveConsumerConfiguration.ORG_SPRINGFRAMEWORK_KAFKA_SUPPORT_KAFKA_NULL,
- "Tombstones event should have KafkaNull payload");
+ if (isDropTombstones.equals("false")) {
+ assertThat(messages.get(3).getPayload()).isEqualTo(Optional.empty());
assertThat(messages.get(3).getHeaders().get("debezium_destination"))
.isEqualTo("my-topic.inventory.customers");
JsonAssert.assertJsonEquals("{\"id\":" + newRecordId + "}",
diff --git a/applications/source/file-source/pom.xml b/applications/source/file-source/pom.xml
index 4f510888b..914b528a1 100644
--- a/applications/source/file-source/pom.xml
+++ b/applications/source/file-source/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- file-supplier
+ spring-file-supplier
org.springframework.cloud.stream.app
@@ -49,13 +49,13 @@
file
source
${project.version}
- org.springframework.cloud.fn.supplier.file.FileSupplierConfiguration.class
-
+ AUTOCONFIGURATION
+ fileSupplier
org.springframework.cloud.fn
- file-supplier
+ spring-file-supplier
org.springframework.cloud.stream.app
diff --git a/applications/source/ftp-source/pom.xml b/applications/source/ftp-source/pom.xml
index 053d13b62..7b73bc724 100644
--- a/applications/source/ftp-source/pom.xml
+++ b/applications/source/ftp-source/pom.xml
@@ -17,11 +17,17 @@
org.springframework.cloud.fn
- ftp-supplier
+ spring-ftp-supplier
org.springframework.cloud.fn
- function-test-support
+ spring-function-test-support
+ test
+
+
+ org.apache.ftpserver
+ ftpserver-core
+ 1.2.0
test
@@ -54,13 +60,13 @@
ftp
source
${project.version}
- org.springframework.cloud.fn.supplier.ftp.FtpSupplierConfiguration.class
-
+ AUTOCONFIGURATION
+ ftpSupplier
org.springframework.cloud.fn
- ftp-supplier
+ spring-ftp-supplier
org.springframework.cloud.stream.app
diff --git a/applications/source/ftp-source/src/test/java/org/springframework/cloud/stream/app/source/ftp/FtpSourceTests.java b/applications/source/ftp-source/src/test/java/org/springframework/cloud/stream/app/source/ftp/FtpSourceTests.java
index 46430cd42..9492d4749 100644
--- a/applications/source/ftp-source/src/test/java/org/springframework/cloud/stream/app/source/ftp/FtpSourceTests.java
+++ b/applications/source/ftp-source/src/test/java/org/springframework/cloud/stream/app/source/ftp/FtpSourceTests.java
@@ -23,7 +23,6 @@
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.test.context.SpringBootTest;
-import org.springframework.cloud.fn.supplier.ftp.FtpSupplierConfiguration;
import org.springframework.cloud.fn.supplier.ftp.FtpSupplierProperties;
import org.springframework.cloud.fn.test.support.ftp.FtpTestSupport;
import org.springframework.cloud.stream.binder.test.OutputDestination;
@@ -54,7 +53,7 @@ public class FtpSourceTests extends FtpTestSupport {
FtpSupplierProperties config;
@Test
- public void testFtpource() {
+ public void testFtpSource() {
Message<byte[]> message = output.receive(10000, "ftpSupplier-out-0");
assertThat(new File(new String(message.getPayload()).replaceAll("\"", ""))).isEqualTo(
new File(this.config.getLocalDir(), "ftpSource1.txt"));
@@ -64,7 +63,7 @@ public void testFtpource() {
}
@SpringBootApplication
- @Import({TestChannelBinderConfiguration.class, FtpSupplierConfiguration.class})
+ @Import(TestChannelBinderConfiguration.class)
public static class SampleConfiguration {
@Bean
@@ -77,4 +76,5 @@ public FtpSupplierProperties ftpSupplierProperties() {
}
}
+
}
diff --git a/applications/source/http-source/README.adoc b/applications/source/http-source/README.adoc
index 06ee12627..786572e30 100644
--- a/applications/source/http-source/README.adoc
+++ b/applications/source/http-source/README.adoc
@@ -23,13 +23,13 @@ The **$$http$$** $$source$$ supports the following configuration properties:
Properties grouped by prefix:
-=== http.cors
+=== http.supplier.cors
$$allow-credentials$$:: $$Whether the browser should include any cookies associated with the domain of the request being annotated.$$ *($$Boolean$$, default: `$$$$`)*
$$allowed-headers$$:: $$List of request headers that can be used during the actual request.$$ *($$String[]$$, default: `$$$$`)*
$$allowed-origins$$:: $$List of allowed origins, e.g. https://domain1.com.$$ *($$String[]$$, default: `$$$$`)*
-=== http
+=== http.supplier
$$mapped-request-headers$$:: $$Headers that will be mapped.$$ *($$String[]$$, default: `$$$$`)*
$$path-pattern$$:: $$HTTP endpoint path mapping.$$ *($$String$$, default: `$$/$$`)*
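Existing deployments that set `http.cors.*` need to move to the renamed prefix. A hypothetical launch showing the new property names (the application class is a stand-in; the property names and example origin come from the table above):

[source,java]
----
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;

// Stand-in for the generated http-source application class.
@SpringBootApplication
public class HttpSourceLauncher {

	public static void main(String[] args) {
		new SpringApplicationBuilder(HttpSourceLauncher.class)
			.run("--spring.cloud.function.definition=httpSupplier",
					"--http.supplier.path-pattern=/events",
					"--http.supplier.cors.allowed-origins=https://domain1.com");
	}

}
----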
diff --git a/applications/source/http-source/pom.xml b/applications/source/http-source/pom.xml
index 3fe7500e0..33ec02423 100644
--- a/applications/source/http-source/pom.xml
+++ b/applications/source/http-source/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- http-supplier
+ spring-http-supplier
org.springframework.boot
@@ -53,8 +53,8 @@
http
source
${project.version}
- org.springframework.cloud.fn.supplier.http.HttpSupplierConfiguration.class
-
+ AUTOCONFIGURATION
+ httpSupplier
reactive>
false
@@ -65,7 +65,7 @@
org.springframework.cloud.fn
- http-supplier
+ spring-http-supplier
org.springframework.cloud.stream.app
diff --git a/applications/source/jdbc-source/pom.xml b/applications/source/jdbc-source/pom.xml
index 733a66fdb..a97cbaf20 100644
--- a/applications/source/jdbc-source/pom.xml
+++ b/applications/source/jdbc-source/pom.xml
@@ -17,14 +17,9 @@
org.springframework.cloud.fn
- jdbc-supplier
+ spring-jdbc-supplier
- org.springframework.cloud
- spring-cloud-stream-test-binder
- test
-
-
com.h2database
h2
test
@@ -58,13 +53,13 @@
jdbc
source
${project.version}
- org.springframework.cloud.fn.supplier.jdbc.JdbcSupplierConfiguration.class
-
+ AUTOCONFIGURATION
+ jdbcSupplier
org.springframework.cloud.fn
- jdbc-supplier
+ spring-jdbc-supplier
org.springframework.cloud.stream.app
diff --git a/applications/source/jms-source/README.adoc b/applications/source/jms-source/README.adoc
index 6cc0644dc..113c61990 100644
--- a/applications/source/jms-source/README.adoc
+++ b/applications/source/jms-source/README.adoc
@@ -28,10 +28,11 @@ $$pub-sub-domain$$:: $$Whether the default destination type is topic.$$ *($$Bool
=== spring.jms.listener
-$$acknowledge-mode$$:: $$Acknowledge mode of the container. By default, the listener is transacted with automatic acknowledgment.$$ *($$AcknowledgeMode$$, default: `$$$$`, possible values: `AUTO`,`CLIENT`,`DUPS_OK`)*
+$$acknowledge-mode$$:: $$$$ *($$AcknowledgeMode$$, default: `$$$$`)*
$$auto-startup$$:: $$Start the container automatically on startup.$$ *($$Boolean$$, default: `$$true$$`)*
-$$concurrency$$:: $$Minimum number of concurrent consumers. When max-concurrency is not specified the minimum will also be used as the maximum.$$ *($$Integer$$, default: `$$$$`)*
+$$concurrency$$:: $$$$ *($$Integer$$, default: `$$$$`)*
$$max-concurrency$$:: $$Maximum number of concurrent consumers.$$ *($$Integer$$, default: `$$$$`)*
+$$min-concurrency$$:: $$Minimum number of concurrent consumers. When max-concurrency is not specified the minimum will also be used as the maximum.$$ *($$Integer$$, default: `$$$$`)*
$$receive-timeout$$:: $$Timeout to use for receive calls. Use -1 for a no-wait receive or 0 for no timeout at all. The latter is only feasible if not running within a transaction manager and is generally discouraged since it prevents clean shutdown.$$ *($$Duration$$, default: `$$1s$$`)*
//end::configuration-properties[]
diff --git a/applications/source/jms-source/pom.xml b/applications/source/jms-source/pom.xml
index 7f970c16f..1f7b00b0b 100644
--- a/applications/source/jms-source/pom.xml
+++ b/applications/source/jms-source/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- jms-supplier
+ spring-jms-supplier
jakarta.jms
@@ -59,13 +59,13 @@
jms
source
${project.version}
- org.springframework.cloud.fn.supplier.jms.JmsSupplierConfiguration.class
-
+ AUTOCONFIGURATION
+ jmsSupplier
org.springframework.cloud.fn
- jms-supplier
+ spring-jms-supplier
org.springframework.cloud.stream.app
diff --git a/applications/source/kafka-source/README.adoc b/applications/source/kafka-source/README.adoc
index d495cd612..8a425c1a3 100644
--- a/applications/source/kafka-source/README.adoc
+++ b/applications/source/kafka-source/README.adoc
@@ -50,6 +50,7 @@ $$ack-mode$$:: $$Listener AckMode. See the spring-kafka documentation.$$ *($$Ack
$$ack-time$$:: $$Time between offset commits when ackMode is "TIME" or "COUNT_TIME".$$ *($$Duration$$, default: `$$$$`)*
$$async-acks$$:: $$Support for asynchronous record acknowledgements. Only applies when spring.kafka.listener.ack-mode is manual or manual-immediate.$$ *($$Boolean$$, default: `$$$$`)*
$$auto-startup$$:: $$Whether to auto start the container.$$ *($$Boolean$$, default: `$$true$$`)*
+$$change-consumer-thread-name$$:: $$Whether to instruct the container to change the consumer thread name during initialization.$$ *($$Boolean$$, default: `$$$$`)*
$$client-id$$:: $$Prefix for the listener's consumer client.id property.$$ *($$String$$, default: `$$$$`)*
$$concurrency$$:: $$Number of threads to run in the listener containers.$$ *($$Integer$$, default: `$$$$`)*
$$idle-between-polls$$:: $$Sleep interval between Consumer.poll(Duration) calls.$$ *($$Duration$$, default: `$$0$$`)*
@@ -60,6 +61,7 @@ $$log-container-config$$:: $$Whether to log the container configuration during i
$$missing-topics-fatal$$:: $$Whether the container should fail to start if at least one of the configured topics are not present on the broker.$$ *($$Boolean$$, default: `$$false$$`)*
$$monitor-interval$$:: $$Time between checks for non-responsive consumers. If a duration suffix is not specified, seconds will be used.$$ *($$Duration$$, default: `$$$$`)*
$$no-poll-threshold$$:: $$Multiplier applied to "pollTimeout" to determine if a consumer is non-responsive.$$ *($$Float$$, default: `$$$$`)*
+$$observation-enabled$$:: $$Whether to enable observation.$$ *($$Boolean$$, default: `$$false$$`)*
$$poll-timeout$$:: $$Timeout to use when polling the consumer.$$ *($$Duration$$, default: `$$$$`)*
$$type$$:: $$Listener type.$$ *($$Type$$, default: `$$single$$`)*
//end::configuration-properties[]
diff --git a/applications/source/kafka-source/pom.xml b/applications/source/kafka-source/pom.xml
index 50ca5b997..4372d8a24 100644
--- a/applications/source/kafka-source/pom.xml
+++ b/applications/source/kafka-source/pom.xml
@@ -19,7 +19,7 @@
org.springframework.cloud.fn
- kafka-supplier
+ spring-kafka-supplier
@@ -58,7 +58,7 @@
org.springframework.cloud.fn
- kafka-supplier
+ spring-kafka-supplier
diff --git a/applications/source/load-generator-source/README.adoc b/applications/source/load-generator-source/README.adoc
index 1a3fd5722..4ea48e26e 100644
--- a/applications/source/load-generator-source/README.adoc
+++ b/applications/source/load-generator-source/README.adoc
@@ -8,10 +8,10 @@ A source that sends generated data and dispatches it to the stream.
The **$$load-generator$$** $$source$$ has the following options:
//tag::configuration-properties[]
-$$load-generator.generate-timestamp$$:: $$Whether timestamp generated.$$ *($$Boolean$$, default: `$$$$`)*
-$$load-generator.message-count$$:: $$Message count.$$ *($$Integer$$, default: `$$$$`)*
-$$load-generator.message-size$$:: $$Message size.$$ *($$Integer$$, default: `$$$$`)*
-$$load-generator.producers$$:: $$Number of producers.$$ *($$Integer$$, default: `$$$$`)*
+$$load-generator.generate-timestamp$$:: $$Whether a timestamp is generated.$$ *($$Boolean$$, default: `$$false$$`)*
+$$load-generator.message-count$$:: $$Message count.$$ *($$Integer$$, default: `$$1000$$`)*
+$$load-generator.message-size$$:: $$Message size.$$ *($$Integer$$, default: `$$1000$$`)*
+$$load-generator.producers$$:: $$Number of producers.$$ *($$Integer$$, default: `$$1$$`)*
//end::configuration-properties[]
//end::ref-doc[]
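With the defaults now documented, only deviations need to be spelled out. A hypothetical run overriding two of them (stand-in application class):

[source,java]
----
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;

// Stand-in for the generated load-generator-source application class.
@SpringBootApplication
public class LoadGeneratorSourceLauncher {

	public static void main(String[] args) {
		new SpringApplicationBuilder(LoadGeneratorSourceLauncher.class)
			.run(// Defaults: 1 producer, 1000 messages of 1000 bytes, no timestamp.
					"--load-generator.producers=4",
					"--load-generator.message-count=100000");
	}

}
----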
diff --git a/applications/source/mail-source/README.adoc b/applications/source/mail-source/README.adoc
index a08737595..d90829cc4 100644
--- a/applications/source/mail-source/README.adoc
+++ b/applications/source/mail-source/README.adoc
@@ -12,7 +12,7 @@ $$mail.supplier.charset$$:: $$The charset for byte[] mail-to-string transformati
$$mail.supplier.delete$$:: $$Set to true to delete email after download.$$ *($$Boolean$$, default: `$$false$$`)*
$$mail.supplier.expression$$:: $$Configure a SpEL expression to select messages.$$ *($$String$$, default: `$$true$$`)*
$$mail.supplier.idle-imap$$:: $$Set to true to use IdleImap Configuration.$$ *($$Boolean$$, default: `$$false$$`)*
-$$mail.supplier.java-mail-properties$$:: $$JavaMail properties as a new line delimited string of name-value pairs, e.g. 'foo=bar\n baz=car'.$$ *($$Properties$$, default: `$$$$`)*
+$$mail.supplier.java-mail-properties$$:: $$Java Mail properties as a new line delimited string of name-value pairs, e.g. 'foo=bar\n baz=car'.$$ *($$Properties$$, default: `$$$$`)*
$$mail.supplier.mark-as-read$$:: $$Set to true to mark email as read.$$ *($$Boolean$$, default: `$$false$$`)*
$$mail.supplier.url$$:: $$Mail connection URL for connection to Mail server e.g. 'imaps://username:password@imap.server.com:993/Inbox'.$$ *($$URLName$$, default: `$$$$`)*
$$mail.supplier.user-flag$$:: $$The flag to mark messages when the server does not support \Recent.$$ *($$String$$, default: `$$$$`)*
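Since `java-mail-properties` is newline-delimited, multi-property values are easiest to see in code. A hypothetical launch enabling IMAPS with debug output (stand-in application class; the two Jakarta Mail keys are standard but illustrative here):

[source,java]
----
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;

// Stand-in for the generated mail-source application class.
@SpringBootApplication
public class MailSourceLauncher {

	public static void main(String[] args) {
		new SpringApplicationBuilder(MailSourceLauncher.class)
			.run("--mail.supplier.url=imaps://username:password@imap.server.com:993/Inbox",
					"--mail.supplier.mark-as-read=true",
					// '\n' separates the individual Jakarta Mail properties.
					"--mail.supplier.java-mail-properties=mail.imaps.ssl.enable=true\nmail.debug=true");
	}

}
----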
diff --git a/applications/source/mail-source/pom.xml b/applications/source/mail-source/pom.xml
index 316d57f06..ee6edb9f4 100644
--- a/applications/source/mail-source/pom.xml
+++ b/applications/source/mail-source/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- mail-supplier
+ spring-mail-supplier
org.springframework.integration
@@ -26,19 +26,9 @@
com.icegreen
- greenmail-junit5
- 2.0.0-alpha-3
+ greenmail
+ 2.1.0-alpha-4
test
-
-
- com.sun.mail
- jakarta.mail
-
-
- jakarta.activation
- jakarta.activation-api
-
-
@@ -64,13 +54,13 @@
mail
source
${project.version}
- org.springframework.cloud.fn.supplier.mail.MailSupplierConfiguration.class
-
+ AUTOCONFIGURATION
+ mailSupplier
org.springframework.cloud.fn
- mail-supplier
+ spring-mail-supplier
org.springframework.cloud.stream.app
diff --git a/applications/source/mongodb-source/pom.xml b/applications/source/mongodb-source/pom.xml
index 573878021..64810a915 100644
--- a/applications/source/mongodb-source/pom.xml
+++ b/applications/source/mongodb-source/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- mongodb-supplier
+ spring-mongodb-supplier
@@ -43,14 +43,13 @@
mongodb
source
${project.version}
- org.springframework.cloud.fn.supplier.mongo.MongodbSupplierConfiguration.class
-
-
+ AUTOCONFIGURATION
+ mongodbSupplier
org.springframework.cloud.fn
- mongodb-supplier
+ spring-mongodb-supplier
org.springframework.cloud.stream.app
diff --git a/applications/source/mqtt-source/README.adoc b/applications/source/mqtt-source/README.adoc
index 14fe33c8d..b0d6b32e3 100644
--- a/applications/source/mqtt-source/README.adoc
+++ b/applications/source/mqtt-source/README.adoc
@@ -30,11 +30,11 @@ $$username$$:: $$the username to use when connecting to the broker.$$ *($$String
=== mqtt.supplier
-$$binary$$:: $$true to leave the payload as bytes.$$ *($$Boolean$$, default: `$$false$$`)*
-$$charset$$:: $$the charset used to convert bytes to String (when binary is false).$$ *($$String$$, default: `$$UTF-8$$`)*
-$$client-id$$:: $$identifies the client.$$ *($$String$$, default: `$$stream.client.id.source$$`)*
-$$qos$$:: $$the qos; a single value for all topics or a comma-delimited list to match the topics.$$ *($$Integer[]$$, default: `$$[0]$$`)*
-$$topics$$:: $$the topic(s) (comma-delimited) to which the source will subscribe.$$ *($$String[]$$, default: `$$[stream.mqtt]$$`)*
+$$binary$$:: $$True to leave the payload as bytes.$$ *($$Boolean$$, default: `$$false$$`)*
+$$charset$$:: $$The charset used to convert bytes to String (when binary is false).$$ *($$String$$, default: `$$UTF-8$$`)*
+$$client-id$$:: $$Identifies the client.$$ *($$String$$, default: `$$stream.client.id.source$$`)*
+$$qos$$:: $$The qos; a single value for all topics or a comma-delimited list to match the topics.$$ *($$Integer[]$$, default: `$$[0]$$`)*
+$$topics$$:: $$The topic(s) (comma-delimited) to which the source will subscribe.$$ *($$String[]$$, default: `$$[stream.mqtt]$$`)*
//end::configuration-properties[]
//end::ref-doc[]
diff --git a/applications/source/mqtt-source/pom.xml b/applications/source/mqtt-source/pom.xml
index 78bc48708..b9171dca7 100644
--- a/applications/source/mqtt-source/pom.xml
+++ b/applications/source/mqtt-source/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- mqtt-supplier
+ spring-mqtt-supplier
org.testcontainers
@@ -49,13 +49,13 @@
mqtt
source
${project.version}
- org.springframework.cloud.fn.supplier.mqtt.MqttSupplierConfiguration.class
-
+ AUTOCONFIGURATION
+ mqttSupplier
org.springframework.cloud.fn
- mqtt-supplier
+ spring-mqtt-supplier
org.springframework.cloud.stream.app
diff --git a/applications/source/rabbit-source/README.adoc b/applications/source/rabbit-source/README.adoc
index 16cf2e66a..409138ded 100644
--- a/applications/source/rabbit-source/README.adoc
+++ b/applications/source/rabbit-source/README.adoc
@@ -31,7 +31,6 @@ $$initial-retry-interval$$:: $$Initial retry interval when retry is enabled.$$ *
$$mapped-request-headers$$:: $$Headers that will be mapped.$$ *($$String[]$$, default: `$$[STANDARD_REQUEST_HEADERS]$$`)*
$$max-attempts$$:: $$The maximum delivery attempts when retry is enabled.$$ *($$Integer$$, default: `$$3$$`)*
$$max-retry-interval$$:: $$Max retry interval when retry is enabled.$$ *($$Integer$$, default: `$$30000$$`)*
-$$own-connection$$:: $$When true, use a separate connection based on the boot properties.$$ *($$Boolean$$, default: `$$false$$`)*
$$queues$$:: $$The queues to which the source will listen for messages.$$ *($$String[]$$, default: `$$$$`)*
$$requeue$$:: $$Whether rejected messages should be requeued.$$ *($$Boolean$$, default: `$$true$$`)*
$$retry-multiplier$$:: $$Retry backoff multiplier when retry is enabled.$$ *($$Double$$, default: `$$2$$`)*
@@ -44,6 +43,7 @@ $$addresses$$:: $$Comma-separated list of addresses to which the client should c
$$channel-rpc-timeout$$:: $$Continuation timeout for RPC calls in channels. Set it to zero to wait forever.$$ *($$Duration$$, default: `$$10m$$`)*
$$connection-timeout$$:: $$Connection timeout. Set it to zero to wait forever.$$ *($$Duration$$, default: `$$$$`)*
$$host$$:: $$RabbitMQ host. Ignored if an address is set.$$ *($$String$$, default: `$$localhost$$`)*
+$$max-inbound-message-body-size$$:: $$Maximum size of the body of inbound (received) messages.$$ *($$DataSize$$, default: `$$64MB$$`)*
$$password$$:: $$Login to authenticate against the broker.$$ *($$String$$, default: `$$guest$$`)*
$$port$$:: $$RabbitMQ port. Ignored if an address is set. Default to 5672, or 5671 if SSL is enabled.$$ *($$Integer$$, default: `$$$$`)*
$$publisher-confirm-type$$:: $$Type of publisher confirms to use.$$ *($$ConfirmType$$, default: `$$$$`, possible values: `SIMPLE`,`CORRELATED`,`NONE`)*
@@ -62,6 +62,7 @@ $$concurrency$$:: $$Minimum number of listener invoker threads.$$ *($$Integer$$,
$$consumer-batch-enabled$$:: $$Whether the container creates a batch of messages based on the 'receive-timeout' and 'batch-size'. Coerces 'de-batching-enabled' to true to include the contents of a producer created batch in the batch as discrete records.$$ *($$Boolean$$, default: `$$false$$`)*
$$de-batching-enabled$$:: $$Whether the container should present batched messages as discrete messages or call the listener with the batch.$$ *($$Boolean$$, default: `$$true$$`)*
$$default-requeue-rejected$$:: $$Whether rejected deliveries are re-queued by default.$$ *($$Boolean$$, default: `$$$$`)*
+$$force-stop$$:: $$Whether the container (when stopped) should stop immediately after processing the current message or stop after processing all pre-fetched messages.$$ *($$Boolean$$, default: `$$false$$`)*
$$idle-event-interval$$:: $$How often idle container events should be published.$$ *($$Duration$$, default: `$$$$`)*
$$max-concurrency$$:: $$Maximum number of listener invoker threads.$$ *($$Integer$$, default: `$$$$`)*
$$missing-queues-fatal$$:: $$Whether to fail if the queues declared by the container are not available on the broker and/or whether to stop the container if one or more queues are deleted at runtime.$$ *($$Boolean$$, default: `$$true$$`)*
diff --git a/applications/source/rabbit-source/pom.xml b/applications/source/rabbit-source/pom.xml
index 7dccd0360..439bf8c86 100644
--- a/applications/source/rabbit-source/pom.xml
+++ b/applications/source/rabbit-source/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- rabbit-supplier
+ spring-rabbit-supplier
org.springframework.boot
@@ -65,14 +65,13 @@
rabbit
source
${project.version}
- org.springframework.cloud.fn.supplier.rabbit.RabbitSupplierConfiguration.class
-
-
+ AUTOCONFIGURATION
+ rabbitSupplier
org.springframework.cloud.fn
- rabbit-supplier
+ spring-rabbit-supplier
org.springframework.cloud.stream.app
diff --git a/applications/source/s3-source/README.adoc b/applications/source/s3-source/README.adoc
index 588af9e71..77e63f606 100644
--- a/applications/source/s3-source/README.adoc
+++ b/applications/source/s3-source/README.adoc
@@ -126,6 +126,7 @@ $$static$$:: $$$$ *($$String$$, default: `$$$$`)*
$$accelerate-mode-enabled$$:: $$Option to enable using the accelerate endpoint when accessing S3. Accelerate endpoints allow faster transfer of objects by using Amazon CloudFront's globally distributed edge locations.$$ *($$Boolean$$, default: `$$$$`)*
$$checksum-validation-enabled$$:: $$Option to disable doing a validation of the checksum of an object stored in S3.$$ *($$Boolean$$, default: `$$$$`)*
$$chunked-encoding-enabled$$:: $$Option to enable using chunked encoding when signing the request payload for {@link software.amazon.awssdk.services.s3.model.PutObjectRequest} and {@link software.amazon.awssdk.services.s3.model.UploadPartRequest}.$$ *($$Boolean$$, default: `$$$$`)*
+$$cross-region-enabled$$:: $$Enables cross-region bucket access.$$ *($$Boolean$$, default: `$$$$`)*
$$endpoint$$:: $$Overrides the default endpoint.$$ *($$URI$$, default: `$$$$`)*
$$path-style-access-enabled$$:: $$Option to enable using path style access for accessing S3 objects instead of DNS style access. DNS style access is preferred as it will result in better load balancing when accessing S3.$$ *($$Boolean$$, default: `$$$$`)*
$$region$$:: $$Overrides the default region.$$ *($$String$$, default: `$$$$`)*
diff --git a/applications/source/s3-source/pom.xml b/applications/source/s3-source/pom.xml
index b19faea4e..fc919056f 100644
--- a/applications/source/s3-source/pom.xml
+++ b/applications/source/s3-source/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- s3-supplier
+ spring-s3-supplier
org.springframework.integration
@@ -48,13 +48,13 @@
s3
source
${project.version}
- org.springframework.cloud.fn.supplier.s3.AwsS3SupplierConfiguration.class
-
+ AUTOCONFIGURATION
+ s3Supplier
org.springframework.cloud.fn
- s3-supplier
+ spring-s3-supplier
org.springframework.cloud.stream.app
diff --git a/applications/source/sftp-source/pom.xml b/applications/source/sftp-source/pom.xml
index 5b3dc1dba..eee5c1f0e 100644
--- a/applications/source/sftp-source/pom.xml
+++ b/applications/source/sftp-source/pom.xml
@@ -17,11 +17,11 @@
org.springframework.cloud.fn
- sftp-supplier
+ spring-sftp-supplier
org.springframework.cloud.fn
- function-test-support
+ spring-function-test-support
test
@@ -54,13 +54,13 @@
sftp
source
${project.version}
- org.springframework.cloud.fn.supplier.sftp.SftpSupplierConfiguration.class
-
+ AUTOCONFIGURATION
+ sftpSupplier
org.springframework.cloud.fn
- sftp-supplier
+ spring-sftp-supplier
org.springframework.cloud.stream.app
diff --git a/applications/source/syslog-source/README.adoc b/applications/source/syslog-source/README.adoc
index 74f6ab2fe..78a690338 100644
--- a/applications/source/syslog-source/README.adoc
+++ b/applications/source/syslog-source/README.adoc
@@ -7,12 +7,12 @@ The syslog source receives SYSLOG packets over UDP, TCP, or both. RFC3164 (BSD)
//tag::configuration-properties[]
$$syslog.supplier.buffer-size$$:: $$the buffer size used when decoding messages; larger messages will be rejected.$$ *($$Integer$$, default: `$$2048$$`)*
-$$syslog.supplier.nio$$:: $$whether or not to use NIO (when supporting a large number of connections).$$ *($$Boolean$$, default: `$$false$$`)*
+$$syslog.supplier.nio$$:: $$Whether to use NIO (when supporting a large number of connections).$$ *($$Boolean$$, default: `$$false$$`)*
$$syslog.supplier.port$$:: $$The port to listen on.$$ *($$Integer$$, default: `$$1514$$`)*
$$syslog.supplier.protocol$$:: $$Protocol used for SYSLOG (tcp or udp).$$ *($$Protocol$$, default: `$$$$`, possible values: `tcp`,`udp`,`both`)*
-$$syslog.supplier.reverse-lookup$$:: $$whether or not to perform a reverse lookup on the incoming socket.$$ *($$Boolean$$, default: `$$false$$`)*
-$$syslog.supplier.rfc$$:: $$'5424' or '3164' - the syslog format according to the RFC; 3164 is aka 'BSD' format.$$ *($$String$$, default: `$$3164$$`)*
-$$syslog.supplier.socket-timeout$$:: $$the socket timeout.$$ *($$Integer$$, default: `$$0$$`)*
+$$syslog.supplier.reverse-lookup$$:: $$Whether to perform a reverse lookup on the incoming socket.$$ *($$Boolean$$, default: `$$false$$`)*
+$$syslog.supplier.rfc$$:: $$'5424' or '3164': the syslog format according to the RFC; 3164 is also known as 'BSD' format.$$ *($$String$$, default: `$$3164$$`)*
+$$syslog.supplier.socket-timeout$$:: $$The socket timeout.$$ *($$Integer$$, default: `$$0$$`)*
//end::configuration-properties[]
//end::ref-doc[]
diff --git a/applications/source/syslog-source/pom.xml b/applications/source/syslog-source/pom.xml
index 5bdac00e8..b5294542b 100644
--- a/applications/source/syslog-source/pom.xml
+++ b/applications/source/syslog-source/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- syslog-supplier
+ spring-syslog-supplier
@@ -43,13 +43,13 @@
syslog
source
${project.version}
- org.springframework.cloud.fn.supplier.syslog.SyslogSupplierConfiguration.class
-
+ AUTOCONFIGURATION
+ syslogSupplier
org.springframework.cloud.fn
- syslog-supplier
+ spring-syslog-supplier
org.springframework.cloud.stream.app
diff --git a/applications/source/tcp-source/README.adoc b/applications/source/tcp-source/README.adoc
index 6fb002326..b46b1c64f 100644
--- a/applications/source/tcp-source/README.adoc
+++ b/applications/source/tcp-source/README.adoc
@@ -15,11 +15,11 @@ Properties grouped by prefix:
=== tcp
-$$nio$$:: $$Whether or not to use NIO.$$ *($$Boolean$$, default: `$$false$$`)*
+$$nio$$:: $$Whether to use NIO.$$ *($$Boolean$$, default: `$$false$$`)*
$$port$$:: $$The port on which to listen; 0 for the OS to choose a port.$$ *($$Integer$$, default: `$$1234$$`)*
$$reverse-lookup$$:: $$Perform a reverse DNS lookup on the remote IP Address; if false, just the IP address is included in the message headers.$$ *($$Boolean$$, default: `$$false$$`)*
$$socket-timeout$$:: $$The timeout (ms) before closing the socket when no data is received.$$ *($$Integer$$, default: `$$120000$$`)*
-$$use-direct-buffers$$:: $$Whether or not to use direct buffers.$$ *($$Boolean$$, default: `$$false$$`)*
+$$use-direct-buffers$$:: $$Whether to use direct buffers.$$ *($$Boolean$$, default: `$$false$$`)*
=== tcp.supplier
diff --git a/applications/source/tcp-source/pom.xml b/applications/source/tcp-source/pom.xml
index ea2bc8b79..187af1c41 100644
--- a/applications/source/tcp-source/pom.xml
+++ b/applications/source/tcp-source/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- tcp-supplier
+ spring-tcp-supplier
@@ -43,13 +43,13 @@
tcp
source
${project.version}
- org.springframework.cloud.fn.supplier.tcp.TcpSupplierConfiguration.class
-
+ AUTOCONFIGURATION
+ tcpSupplier
org.springframework.cloud.fn
- tcp-supplier
+ spring-tcp-supplier
org.springframework.cloud.stream.app
diff --git a/applications/source/time-source/pom.xml b/applications/source/time-source/pom.xml
index 47123a874..6b1f91c7f 100644
--- a/applications/source/time-source/pom.xml
+++ b/applications/source/time-source/pom.xml
@@ -17,7 +17,7 @@
org.springframework.cloud.fn
- time-supplier
+ spring-time-supplier
org.springframework.cloud.stream.app
@@ -49,13 +49,14 @@
time
source
${project.version}
- org.springframework.cloud.fn.supplier.time.TimeSupplierConfiguration.class
+ AUTOCONFIGURATION
+ timeSupplier
org.springframework.cloud.fn
- time-supplier
- ${java-functions.version}
+ spring-time-supplier
+
org.springframework.cloud.stream.app
diff --git a/applications/source/twitter-message-source/README.adoc b/applications/source/twitter-message-source/README.adoc
index c5e734be1..7e8e16b27 100644
--- a/applications/source/twitter-message-source/README.adoc
+++ b/applications/source/twitter-message-source/README.adoc
@@ -37,6 +37,7 @@ $$raw-json$$:: $$Enable caching the original (raw) JSON objects as returned by t
=== twitter.message.source
$$count$$:: $$Max number of events to be returned. 20 default. 50 max.$$ *($$Integer$$, default: `$$20$$`)*
+$$enabled$$:: $$Whether to enable Twitter message receiving.$$ *($$Boolean$$, default: `$$false$$`)*
//end::configuration-properties[]
//end::ref-doc[]
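Because `enabled` defaults to `false`, the supplier is now opt-in; the integration test below passes the flag explicitly. A hypothetical standalone launch doing the same (stand-in application class; the function name matches the renamed `twitterMessagesSupplier` definition):

[source,java]
----
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;

// Stand-in for the generated twitter-message-source application class.
@SpringBootApplication
public class TwitterMessageSourceLauncher {

	public static void main(String[] args) {
		new SpringApplicationBuilder(TwitterMessageSourceLauncher.class)
			.run("--spring.cloud.function.definition=twitterMessagesSupplier",
					"--twitter.message.source.enabled=true",
					"--twitter.message.source.count=15");
	}

}
----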
diff --git a/applications/source/twitter-message-source/pom.xml b/applications/source/twitter-message-source/pom.xml
index e990e8202..86055bf88 100644
--- a/applications/source/twitter-message-source/pom.xml
+++ b/applications/source/twitter-message-source/pom.xml
@@ -17,26 +17,18 @@
org.springframework.cloud.fn
- twitter-supplier
+ spring-twitter-supplier
org.springframework.cloud.fn
- function-test-support
+ spring-function-test-support
test
org.mock-server
mockserver-netty
- ${mockserver.version}
test
-
- org.mock-server
- mockserver-client-java
- ${mockserver.version}
- test
-
-
@@ -61,16 +53,14 @@
twitter-message
source
${project.version}
-
- org.springframework.cloud.fn.supplier.twitter.message.TwitterMessageSupplierConfiguration.class
-
- twitterMessageSupplier
+ AUTOCONFIGURATION
+ twitterMessagesSupplier
org.springframework.cloud.fn
- twitter-supplier
+ spring-twitter-supplier
diff --git a/applications/source/twitter-message-source/src/test/java/org/springframework/cloud/stream/app/source/twitter/message/TwitterMessageSourceIntegrationTests.java b/applications/source/twitter-message-source/src/test/java/org/springframework/cloud/stream/app/source/twitter/message/TwitterMessageSourceIntegrationTests.java
index 66216d7d5..709bc510e 100644
--- a/applications/source/twitter-message-source/src/test/java/org/springframework/cloud/stream/app/source/twitter/message/TwitterMessageSourceIntegrationTests.java
+++ b/applications/source/twitter-message-source/src/test/java/org/springframework/cloud/stream/app/source/twitter/message/TwitterMessageSourceIntegrationTests.java
@@ -1,5 +1,5 @@
/*
- * Copyright 2020-2021 the original author or authors.
+ * Copyright 2020-2024 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,7 +18,6 @@
import java.time.Duration;
import java.util.List;
-import java.util.concurrent.TimeUnit;
import java.util.function.Function;
import com.fasterxml.jackson.core.JsonProcessingException;
@@ -38,16 +37,13 @@
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.cloud.fn.common.twitter.TwitterConnectionProperties;
import org.springframework.cloud.fn.common.twitter.util.TwitterTestUtils;
-import org.springframework.cloud.fn.supplier.twitter.message.TwitterMessageSupplierConfiguration;
import org.springframework.cloud.fn.supplier.twitter.message.TwitterMessageSupplierProperties;
import org.springframework.cloud.stream.binder.test.OutputDestination;
import org.springframework.cloud.stream.binder.test.TestChannelBinderConfiguration;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.annotation.Bean;
-import org.springframework.context.annotation.Import;
import org.springframework.context.annotation.Primary;
import org.springframework.messaging.Message;
-import org.springframework.test.util.TestSocketUtils;
import static org.assertj.core.api.Assertions.assertThat;
import static org.mockserver.matchers.Times.exactly;
@@ -57,23 +53,21 @@
/**
* @author Christian Tzolov
+ * @author Artem Bilan
*/
public class TwitterMessageSourceIntegrationTests {
- private static final String MOCK_SERVER_IP = "127.0.0.1";
-
- private static final Integer MOCK_SERVER_PORT = TestSocketUtils.findAvailableTcpPort();
-
private static ClientAndServer mockServer;
private static MockServerClient mockClient;
+
private static HttpRequest messageRequest;
@BeforeAll
public static void startServer() {
- mockServer = ClientAndServer.startClientAndServer(MOCK_SERVER_PORT);
- mockClient = new MockServerClient(MOCK_SERVER_IP, MOCK_SERVER_PORT);
+ mockServer = ClientAndServer.startClientAndServer();
+ mockClient = new MockServerClient("localhost", mockServer.getPort());
messageRequest = setExpectation(request()
.withMethod("GET")
@@ -93,15 +87,15 @@ public void twitterMessageSourceTests() throws JsonProcessingException {
.getCompleteConfiguration(TestTwitterMessageSourceApplication.class))
.web(WebApplicationType.NONE)
- .run("--spring.cloud.function.definition=twitterMessageSupplier",
+ .run("--spring.cloud.function.definition=twitterMessagesSupplier",
"--twitter.connection.consumerKey=consumerKey666",
"--twitter.connection.consumerSecret=consumerSecret666",
"--twitter.connection.accessToken=accessToken666",
"--twitter.connection.accessTokenSecret=accessTokenSecret666",
- "--twitter.message.source.count=15",
- "--spring.cloud.stream.poller.fixed-delay=3000")) {
+ "--twitter.message.source.enabled=true",
+ "--twitter.message.source.count=15")) {
TwitterConnectionProperties twitterConnectionProperties = context.getBean(TwitterConnectionProperties.class);
assertThat(twitterConnectionProperties.getConsumerKey()).isEqualTo("consumerKey666");
@@ -109,19 +103,16 @@ public void twitterMessageSourceTests() throws JsonProcessingException {
assertThat(twitterConnectionProperties.getAccessToken()).isEqualTo("accessToken666");
assertThat(twitterConnectionProperties.getAccessTokenSecret()).isEqualTo("accessTokenSecret666");
-// DefaultPollerProperties defaultPollerProperties = context.getBean(DefaultPollerProperties.class);
-// assertThat(defaultPollerProperties.getFixedDelay()).isEqualTo(3000);
-
TwitterMessageSupplierProperties twitterMessageSupplierProperties = context.getBean(TwitterMessageSupplierProperties.class);
assertThat(twitterMessageSupplierProperties.getCount()).isEqualTo(15);
OutputDestination outputDestination = context.getBean(OutputDestination.class);
// Using local region here
- Message<byte[]> message = outputDestination.receive(Duration.ofSeconds(300).toMillis(), "twitterMessageSupplier-out-0");
+ Message<byte[]> message = outputDestination.receive(Duration.ofSeconds(10).toMillis(), "twitterMessagesSupplier-out-0");
assertThat(message).isNotNull();
String payload = new String(message.getPayload());
- List tweets = new ObjectMapper().readValue(payload, List.class);
+ List<Map<String, Object>> tweets = new ObjectMapper().readValue(payload, List.class);
assertThat(tweets).hasSize(4);
mockClient.verify(messageRequest, once());
}
@@ -135,14 +126,12 @@ private static HttpRequest setExpectation(HttpRequest request) {
.withHeaders(
new Header("Content-Type", "application/json; charset=utf-8"),
new Header("Cache-Control", "public, max-age=86400"))
- .withBody(TwitterTestUtils.asString("classpath:/response/messages.json"))
- .withDelay(TimeUnit.SECONDS, 10));
+ .withBody(TwitterTestUtils.asString("classpath:/response/messages.json")));
return request;
}
@SpringBootConfiguration
@EnableAutoConfiguration
- @Import(TwitterMessageSupplierConfiguration.class)
public static class TestTwitterMessageSourceApplication {
@Bean
@@ -152,11 +141,11 @@ public twitter4j.conf.Configuration twitterConfiguration2(TwitterConnectionPrope
Function<TwitterConnectionProperties, ConfigurationBuilder> mockedConfiguration =
toConfigurationBuilder.andThen(
- new TwitterTestUtils().mockTwitterUrls(
- String.format("http://%s:%s", MOCK_SERVER_IP, MOCK_SERVER_PORT)));
+ new TwitterTestUtils().mockTwitterUrls("http://localhost:" + mockServer.getPort()));
return mockedConfiguration.apply(properties).build();
}
+
}
}
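
The switch above from a pre-allocated `TestSocketUtils` port to MockServer's own dynamically assigned port removes the race between finding a free port and later binding it. The following is a minimal, self-contained sketch of that pattern; the class name and the `/ping` expectation are illustrative and not part of the diff:

[source,java]
----
import org.mockserver.client.MockServerClient;
import org.mockserver.integration.ClientAndServer;

import static org.mockserver.model.HttpRequest.request;
import static org.mockserver.model.HttpResponse.response;

public class DynamicPortMockServerSketch {

    public static void main(String[] args) {
        // Let MockServer bind a free port itself instead of reserving one up front.
        ClientAndServer mockServer = ClientAndServer.startClientAndServer();
        try {
            MockServerClient mockClient = new MockServerClient("localhost", mockServer.getPort());
            mockClient.when(request().withMethod("GET").withPath("/ping"))
                    .respond(response().withBody("pong"));

            // Build the base URL from the assigned port, just as the test does for
            // its mocked Twitter endpoints.
            String baseUrl = "http://localhost:" + mockServer.getPort();
            System.out.println("MockServer listening at " + baseUrl);
        }
        finally {
            mockServer.stop();
        }
    }
}
----

Because the same assigned port feeds both the client and the mocked base URL, no port constants need to be kept in sync across the test.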
diff --git a/applications/source/twitter-search-source/README.adoc b/applications/source/twitter-search-source/README.adoc
index 306de4d2e..42a8443bb 100644
--- a/applications/source/twitter-search-source/README.adoc
+++ b/applications/source/twitter-search-source/README.adoc
@@ -40,11 +40,12 @@ $$raw-json$$:: $$Enable caching the original (raw) JSON objects as returned by t
=== twitter.search
$$count$$:: $$Number of tweets to return per page (e.g. per single request), up to a max of 100.$$ *($$Integer$$, default: `$$100$$`)*
-$$lang$$:: $$Restricts searched tweets to the given language, given by an http://en.wikipedia.org/wiki/ISO_639-1 .$$ *($$String$$, default: `$$$$`)*
+$$enabled$$:: $$Whether to enable Twitter search supplier.$$ *($$Boolean$$, default: `$$false$$`)*
+$$lang$$:: $$Restricts searched tweets to the given language, given by its http://en.wikipedia.org/wiki/ISO_639-1[ISO 639-1] code.$$ *($$String$$, default: `$$$$`)*
$$page$$:: $$Number of pages (e.g. requests) to search backwards (from the most recent to the oldest tweets) before restarting the search from the most recent tweets again. The total number of tweets searched backwards is (page * count); see the launch sketch below.$$ *($$Integer$$, default: `$$3$$`)*
$$query$$:: $$Search tweets by search query string.$$ *($$String$$, default: `$$$$`)*
$$restart-from-most-recent-on-empty-response$$:: $$Restart search from the most recent tweets on empty response. Applied only after the first restart (e.g. when since_id != UNBOUNDED)$$ *($$Boolean$$, default: `$$false$$`)*
-$$result-type$$:: $$Specifies what type of search results you would prefer to receive. The current default is "mixed." Valid values include: mixed : Include both popular and real time results in the response. recent : return only the most recent results in the response popular : return only the most popular results in the response$$ *($$ResultType$$, default: `$$$$`, possible values: `popular`,`mixed`,`recent`)*
+$$result-type$$:: $$Specifies what type of search results you would prefer to receive. The current default is "mixed". Valid values: mixed (include both popular and real-time results in the response); recent (return only the most recent results); popular (return only the most popular results).$$ *($$ResultType$$, default: `$$$$`, possible values: `popular`,`mixed`,`recent`)*
$$since$$:: $$If specified, returns tweets with since the given date. Date should be formatted as YYYY-MM-DD.$$ *($$String$$, default: `$$$$`)*
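
The `page` and `count` properties above combine multiplicatively: with the defaults shown, the supplier pages back through 3 * 100 = 300 tweets before restarting from the most recent ones. The sketch below launches the source with the new opt-in `enabled` flag; the main class is a hypothetical stand-in for the generated `twitter-search-source` application, and the connection credentials are omitted:

[source,java]
----
import org.springframework.boot.WebApplicationType;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;

// Stand-in for the generated twitter-search-source main class.
@SpringBootApplication
public class TwitterSearchSourceLaunchSketch {

    public static void main(String[] args) {
        new SpringApplicationBuilder(TwitterSearchSourceLaunchSketch.class)
                .web(WebApplicationType.NONE)
                .run("--twitter.search.enabled=true", // the supplier is now opt-in
                        "--twitter.search.query=#spring",
                        "--twitter.search.count=100", // tweets per request, max 100
                        "--twitter.search.page=3"); // walks back 3 * 100 = 300 tweets
    }
}
----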
=== twitter.search.geocode
diff --git a/applications/source/twitter-search-source/pom.xml b/applications/source/twitter-search-source/pom.xml
index 72912f936..dfc5dd783 100644
--- a/applications/source/twitter-search-source/pom.xml
+++ b/applications/source/twitter-search-source/pom.xml
@@ -17,23 +17,16 @@