JAVA-3078: Provide support for building with Java 11 and Java 17
Details:
- Update to api-plumber-doclet 2.0.0
- Add org.jetbrains:annotations as additionalDependency for core-shaded
- Refactor the table summary attribute into a caption element to conform to HTML5
- Refactor <h3> usage to <h2> to conform to heading tag ordering for HTML5
- Explicitly define annotationProcessorPaths for mapper-processor, slf4j-nop and gremlin-core
- [java11+] Upgrade errorprone to 2.19.1
- [java11+] Update custom javadoc leaks tag to full string: leaks-private-api
- Add build-java-8 profile for compiling with Java 8 (see the profile sketch after this list)
- Add build-java-11 profile for compiling with Java 11
- [java11+] Exclude many new Error Prone checks enabled by default in the new version (see JAVA-3102)
- [java11+] Use release=8 instead of source/target=1.8 to automatically select the correct bootstrap classpath
- [java11+] Use fork=true with maven-compiler-plugin so that compilerArgs are picked up
- [java11+] Load the Error Prone plugin via annotationProcessorPaths, per its documentation (see the compiler-plugin sketch after this list)
- Refactor Jenkinsfile to compile project with the Java version selected in the pipeline matrix
- Remove test-jdk profiles and surefire/failsafe JVM overrides added in JAVA-3042
- Update org.apache.felix.framework to 7.0.1 to support Java 17 without forking a new JVM (FELIX-6287)
- Update commons-configuration2 in the OSGi BundleOptions to 2.9.0 for Java 11
- [java11+] Set JAVA_HOME=JAVA8_HOME in CcmBridge when using DSE, which only supports Java 8
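
The build-java-8/build-java-11 profiles referenced above live in the parent pom.xml, which is not among the hunks shown below. As a rough sketch only — the activation triggers, property names and version range here are assumptions, not copied from the commit — the pair could look something like this:

  <profiles>
    <profile>
      <!-- picked up on a JDK 8 toolchain, where javac only targets its own platform -->
      <id>build-java-8</id>
      <activation>
        <jdk>1.8</jdk>
      </activation>
      <properties>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
      </properties>
    </profile>
    <profile>
      <!-- picked up on JDK 11 or newer; release=8 pins both the bytecode level and the JDK 8 API surface -->
      <id>build-java-11</id>
      <activation>
        <jdk>[11,)</jdk>
      </activation>
      <properties>
        <maven.compiler.release>8</maven.compiler.release>
      </properties>
    </profile>
  </profiles>

Either profile can also be selected explicitly with mvn -Pbuild-java-8 or mvn -Pbuild-java-11 when JDK-based activation is not wanted.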
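The [java11+] compiler changes (release=8, fork=true, Error Prone loaded through annotationProcessorPaths, individual checks disabled) are likewise not visible in the hunks below. The sketch that follows shows how those pieces typically combine in maven-compiler-plugin; the Error Prone coordinates, the disabled check and the -J flag are illustrative choices, not taken verbatim from the commit:

  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <configuration>
      <!-- release replaces source/target so javac compiles against the JDK 8 API -->
      <release>8</release>
      <!-- fork the compiler so the -J... JVM flags in compilerArgs reach the javac process -->
      <fork>true</fork>
      <compilerArgs>
        <arg>-XDcompilePolicy=simple</arg>
        <!-- checks excluded under JAVA-3102 are disabled with -Xep:<CheckName>:OFF -->
        <arg>-Xplugin:ErrorProne -Xep:InlineMeSuggester:OFF</arg>
        <!-- newer JDKs encapsulate the javac internals that Error Prone relies on -->
        <arg>-J--add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED</arg>
      </compilerArgs>
      <annotationProcessorPaths>
        <path>
          <groupId>com.google.errorprone</groupId>
          <artifactId>error_prone_core</artifactId>
          <version>2.19.1</version>
        </path>
      </annotationProcessorPaths>
    </configuration>
  </plugin>

Error Prone 2.19.1 itself needs a JDK 11+ javac to run, even when emitting Java 8 bytecode, which is why these arguments belong to the Java 11 build path rather than the default JDK 8 build.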
hhughes committed Jul 29, 2023
1 parent ec93ef9 commit 59f3d76
Showing 19 changed files with 204 additions and 155 deletions.
55 changes: 10 additions & 45 deletions Jenkinsfile
@@ -19,36 +19,13 @@ def initializeEnvironment() {
env.MAVEN_HOME = "${env.HOME}/.mvn/apache-maven-3.3.9"
env.PATH = "${env.MAVEN_HOME}/bin:${env.PATH}"

/*
* As of JAVA-3042 JAVA_HOME is always set to JDK8 and this is currently necessary for mvn compile and DSE Search/Graph.
* To facilitate testing with JDK11/17 we feed the appropriate JAVA_HOME into the maven build via commandline.
*
* Maven command-line flags:
* - -DtestJavaHome=/path/to/java/home: overrides JAVA_HOME for surefire/failsafe tests, defaults to environment JAVA_HOME.
* - -Ptest-jdk-N: enables profile for running tests with a specific JDK version (substitute N for 8/11/17).
*
* Note test-jdk-N is also automatically loaded based off JAVA_HOME SDK version so testing with an older SDK is not supported.
*
* Environment variables:
* - JAVA_HOME: Path to JDK used for mvn (all steps except surefire/failsafe), Cassandra, DSE.
* - JAVA8_HOME: Path to JDK8 used for Cassandra/DSE if ccm determines JAVA_HOME is not compatible with the chosen backend.
* - TEST_JAVA_HOME: PATH to JDK used for surefire/failsafe testing.
* - TEST_JAVA_VERSION: TEST_JAVA_HOME SDK version number [8/11/17], used to configure test-jdk-N profile in maven (see above)
*/

env.JAVA_HOME = sh(label: 'Get JAVA_HOME',script: '''#!/bin/bash -le
. ${JABBA_SHELL}
jabba which ${JABBA_VERSION}''', returnStdout: true).trim()
env.JAVA8_HOME = sh(label: 'Get JAVA8_HOME',script: '''#!/bin/bash -le
. ${JABBA_SHELL}
jabba which 1.8''', returnStdout: true).trim()

env.TEST_JAVA_HOME = sh(label: 'Get TEST_JAVA_HOME',script: '''#!/bin/bash -le
. ${JABBA_SHELL}
jabba which ${JABBA_VERSION}''', returnStdout: true).trim()
env.TEST_JAVA_VERSION = sh(label: 'Get TEST_JAVA_VERSION',script: '''#!/bin/bash -le
echo "${JABBA_VERSION##*.}"''', returnStdout: true).trim()

sh label: 'Download Apache CassandraⓇ or DataStax Enterprise',script: '''#!/bin/bash -le
. ${JABBA_SHELL}
jabba use 1.8
@@ -77,23 +54,21 @@ ENVIRONMENT_EOF
set +o allexport
. ${JABBA_SHELL}
jabba use 1.8
jabba use ${JABBA_VERSION}
java -version
mvn -v
printenv | sort
'''
}

def buildDriver(jabbaVersion) {
withEnv(["BUILD_JABBA_VERSION=${jabbaVersion}"]) {
sh label: 'Build driver', script: '''#!/bin/bash -le
. ${JABBA_SHELL}
jabba use ${BUILD_JABBA_VERSION}
def buildDriver() {
sh label: 'Build driver', script: '''#!/bin/bash -le
. ${JABBA_SHELL}
jabba use ${JABBA_VERSION}
mvn -B -V install -DskipTests -Dmaven.javadoc.skip=true
'''
}
mvn -B -V install -DskipTests -Dmaven.javadoc.skip=true
'''
}

def executeTests() {
@@ -104,13 +79,7 @@ def executeTests() {
set +o allexport
. ${JABBA_SHELL}
jabba use 1.8
if [ "${JABBA_VERSION}" != "1.8" ]; then
SKIP_JAVADOCS=true
else
SKIP_JAVADOCS=false
fi
jabba use ${JABBA_VERSION}
INTEGRATION_TESTS_FILTER_ARGUMENT=""
if [ ! -z "${INTEGRATION_TESTS_FILTER}" ]; then
@@ -119,11 +88,8 @@ def executeTests() {
printenv | sort
mvn -B -V ${INTEGRATION_TESTS_FILTER_ARGUMENT} -T 1 verify \
-Ptest-jdk-${TEST_JAVA_VERSION} \
-DtestJavaHome=${TEST_JAVA_HOME} \
-DfailIfNoTests=false \
-Dmaven.test.failure.ignore=true \
-Dmaven.javadoc.skip=${SKIP_JAVADOCS} \
-Dccm.version=${CCM_CASSANDRA_VERSION} \
-Dccm.dse=${CCM_IS_DSE} \
-Dproxy.path=${HOME}/proxy \
@@ -459,7 +425,7 @@ pipeline {
}
stage('Build-Driver') {
steps {
buildDriver('default')
buildDriver()
}
}
stage('Execute-Tests') {
Expand Down Expand Up @@ -573,8 +539,7 @@ pipeline {
}
stage('Build-Driver') {
steps {
// Jabba default should be a JDK8 for now
buildDriver('default')
buildDriver()
}
}
stage('Execute-Tests') {

5 changes: 5 additions & 0 deletions core-shaded/pom.xml
@@ -283,6 +283,11 @@
<artifactId>esri-geometry-api</artifactId>
<version>1.2.1</version>
</additionalDependency>
<additionalDependency>
<groupId>org.jetbrains</groupId>
<artifactId>annotations</artifactId>
<version>24.0.1</version>
</additionalDependency>
</additionalDependencies>
</configuration>
</execution>

1 change: 0 additions & 1 deletion core/pom.xml
@@ -248,7 +248,6 @@
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<jvm>${testing.jvm}/bin/java</jvm>
<argLine>${mockitoopens.argline}</argLine>
<threadCount>1</threadCount>
<properties>

@@ -41,7 +41,8 @@
*
* Examples:
*
* <table summary="examples">
* <table>
* <caption>examples</caption>
* <tr><th>Create statement</th><th>Case-sensitive?</th><th>CQL id</th><th>Internal id</th></tr>
* <tr><td>CREATE TABLE t(foo int PRIMARY KEY)</td><td>No</td><td>foo</td><td>foo</td></tr>
* <tr><td>CREATE TABLE t(Foo int PRIMARY KEY)</td><td>No</td><td>foo</td><td>foo</td></tr>

@@ -44,7 +44,7 @@
* a reasonable trade-off if the cardinality stays low. This class provides a way to emulate this
* behavior on the client side.
*
* <h3>Performance considerations</h3>
* <h2>Performance considerations</h2>
*
* For each page that you want to retrieve:
*
@@ -69,7 +69,7 @@
* OffsetPager.Page&lt;Row&gt; page5 = pager.getPage(rs, 5);
* </pre>
*
* <h3>Establishing application-level guardrails</h3>
* <h2>Establishing application-level guardrails</h2>
*
* Linear performance should be fine for the values typically encountered in real-world
* applications: for example, if the page size is 25 and users never go past page 10, the worst case
@@ -79,7 +79,7 @@
* maximum, so that an attacker can't inject a large value that could potentially fetch millions of
* rows.
*
* <h3>Relation with protocol-level paging</h3>
* <h2>Relation with protocol-level paging</h2>
*
* Protocol-level paging refers to the ability to split large response into multiple network chunks:
* see {@link Statement#setPageSize(int)} and {@code basic.request.page-size} in the configuration.

34 changes: 31 additions & 3 deletions integration-tests/pom.xml
@@ -219,6 +219,37 @@
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<executions>
<execution>
<id>process-test-annotations</id>
<phase>generate-test-resources</phase>
<goals>
<goal>testCompile</goal>
</goals>
<configuration>
<annotationProcessorPaths>
<path>
<groupId>com.datastax.oss</groupId>
<artifactId>java-driver-mapper-processor</artifactId>
<version>${project.version}</version>
</path>
<path>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-nop</artifactId>
<version>${slf4j.version}</version>
</path>
<path>
<groupId>org.apache.tinkerpop</groupId>
<artifactId>gremlin-core</artifactId>
<version>${tinkerpop.version}</version>
</path>
</annotationProcessorPaths>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
@@ -229,7 +260,6 @@
<goal>integration-test</goal>
</goals>
<configuration>
<jvm>${testing.jvm}/bin/java</jvm>
<groups>com.datastax.oss.driver.categories.ParallelizableTests</groups>
<parallel>classes</parallel>
<threadCountClasses>8</threadCountClasses>
@@ -246,7 +276,6 @@
<excludedGroups>com.datastax.oss.driver.categories.ParallelizableTests, com.datastax.oss.driver.categories.IsolatedTests</excludedGroups>
<summaryFile>${project.build.directory}/failsafe-reports/failsafe-summary-serial.xml</summaryFile>
<skipITs>${skipSerialITs}</skipITs>
<jvm>${testing.jvm}/bin/java</jvm>
</configuration>
</execution>
<execution>
@@ -262,7 +291,6 @@
<summaryFile>${project.build.directory}/failsafe-reports/failsafe-summary-isolated.xml</summaryFile>
<skipITs>${skipIsolatedITs}</skipITs>
<argLine>${blockhound.argline}</argLine>
<jvm>${testing.jvm}/bin/java</jvm>
</configuration>
</execution>
<execution>

1 change: 0 additions & 1 deletion mapper-runtime/pom.xml
@@ -122,7 +122,6 @@
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<jvm>${testing.jvm}/bin/java</jvm>
<threadCount>1</threadCount>
<properties>
<!-- tell TestNG not to run jUnit tests -->

@@ -45,7 +45,7 @@
* }
* </pre>
*
* <h3>Parameters</h3>
* <h2>Parameters</h2>
*
* The method can operate either on an entity instance, or on a primary key (partition key +
* clustering columns).
@@ -75,7 +75,7 @@
* parameter. It will be applied to the statement before execution. This allows you to customize
* certain aspects of the request (page size, timeout, etc) at runtime.
*
* <h3>Return type</h3>
* <h2>Return type</h2>
*
* The method can return:
*
@@ -125,7 +125,7 @@
* practical purpose for that since those queries always return {@code wasApplied = true} and an
* empty result set.
*
* <h3>Target keyspace and table</h3>
* <h2>Target keyspace and table</h2>
*
* If a keyspace was specified when creating the DAO (see {@link DaoFactory}), then the generated
* query targets that keyspace. Otherwise, it doesn't specify a keyspace, and will only work if the

@@ -53,7 +53,7 @@
* <p>It does not perform a query. Instead, those methods are intended for cases where you already
* have a query result, and just need the conversion logic.
*
* <h3>Parameters</h3>
* <h2>Parameters</h2>
*
* The method must have a single parameter. The following types are allowed:
*
@@ -67,7 +67,7 @@
* The data must match the target entity: the generated code will try to extract every mapped
* property, and fail if one is missing.
*
* <h3>Return type</h3>
* <h2>Return type</h2>
*
* The method can return:
*

@@ -56,7 +56,7 @@
* }
* </pre>
*
* <h3>Parameters</h3>
* <h2>Parameters</h2>
*
* The entity class must be specified with {@link #entityClass()}.
*
@@ -90,12 +90,12 @@
* parameter. It will be applied to the statement before execution. This allows you to customize
* certain aspects of the request (page size, timeout, etc) at runtime.
*
* <h3>Return type</h3>
* <h2>Return type</h2>
*
* <p>The method can return {@code void}, a void {@link CompletionStage} or {@link
* CompletableFuture}, or a {@link ReactiveResultSet}.
*
* <h3>Target keyspace and table</h3>
* <h2>Target keyspace and table</h2>
*
* <p>If a keyspace was specified when creating the DAO (see {@link DaoFactory}), then the generated
* query targets that keyspace. Otherwise, it doesn't specify a keyspace, and will only work if the

@@ -46,7 +46,7 @@
* }
* </pre>
*
* <h3>Parameters</h3>
* <h2>Parameters</h2>
*
* The first parameter must be the entity to insert.
*
@@ -64,7 +64,7 @@
* parameter. It will be applied to the statement before execution. This allows you to customize
* certain aspects of the request (page size, timeout, etc) at runtime.
*
* <h3>Return type</h3>
* <h2>Return type</h2>
*
* The method can return:
*
@@ -120,7 +120,7 @@
* <li>a {@linkplain MapperResultProducer custom type}.
* </ul>
*
* <h3>Target keyspace and table</h3>
* <h2>Target keyspace and table</h2>
*
* If a keyspace was specified when creating the DAO (see {@link DaoFactory}), then the generated
* query targets that keyspace. Otherwise, it doesn't specify a keyspace, and will only work if the

@@ -53,7 +53,7 @@
*
* This is the equivalent of what was called "accessor methods" in the driver 3 mapper.
*
* <h3>Parameters</h3>
* <h2>Parameters</h2>
*
* The query string provided in {@link #value()} will typically contain CQL placeholders. The
* method's parameters must match those placeholders: same name and a compatible Java type.
@@ -68,7 +68,7 @@
* parameter. It will be applied to the statement before execution. This allows you to customize
* certain aspects of the request (page size, timeout, etc) at runtime.
*
* <h3>Return type</h3>
* <h2>Return type</h2>
*
* The method can return:
*
@@ -98,7 +98,7 @@
* <li>a {@linkplain MapperResultProducer custom type}.
* </ul>
*
* <h3>Target keyspace and table</h3>
* <h2>Target keyspace and table</h2>
*
* To avoid hard-coding the keyspace and table name, the query string supports 3 additional
* placeholders: {@code ${keyspaceId}}, {@code ${tableId}} and {@code ${qualifiedTableId}}. They get

@@ -46,7 +46,7 @@
* }
* </pre>
*
* <h3>Parameters</h3>
* <h2>Parameters</h2>
*
* If {@link #customWhereClause()} is empty, the mapper defaults to a selection by primary key
* (partition key + clustering columns). The method's parameters must match the types of the primary
@@ -85,7 +85,7 @@
* parameter. It will be applied to the statement before execution. This allows you to customize
* certain aspects of the request (page size, timeout, etc) at runtime.
*
* <h3>Return type</h3>
* <h2>Return type</h2>
*
* <p>In all cases, the method can return:
*
@@ -130,7 +130,7 @@
* <li>a {@linkplain MapperResultProducer custom type}.
* </ul>
*
* <h3>Target keyspace and table</h3>
* <h2>Target keyspace and table</h2>
*
* If a keyspace was specified when creating the DAO (see {@link DaoFactory}), then the generated
* query targets that keyspace. Otherwise, it doesn't specify a keyspace, and will only work if the