Run spark-rapids-jni CI #17781
base: branch-25.04
Conversation
Auto-sync is disabled for draft pull requests in this repository. Workflows must be run manually.
/ok to test
This reverts commit 6625931.
I've removed the cpp codeowner review requests since those were just the result of a test commit. I'm happy to provide the ci codeowner review if needed when things are ready though.
@KyleFromNVIDIA let's move this to 25.04. 25.02 is freezing today so there won't be any major changes going in that would break Spark anyway. We can make this one of the first PRs that we merge in 25.04.
      - name: "Build spark-rapids-jni"
        run: |
          mkdir target
          GPU_ARCHS=90 LIBCUDF_DEPENDENCY_MODE=latest USE_GDS=on scl enable gcc-toolset-11 build/buildcpp.sh
This already looks great.
We can improve it by executing the exact Maven command that a regular spark-rapids-jni build would run:
./build/build-in-docker verify -DskipTests -DGPU_ARCHS=90 -Dlibcudf.dependency.mode=latest
This can be a follow-up change.
Will that work when it's already inside a Docker container though?
It may work, but good point that recursing into a nested container is not desired. What I mean is the Maven command following run-in-docker:
https://github.com/NVIDIA/spark-rapids-jni/blob/6883364a87dec972ce9e51027e716eba860af568/build/build-in-docker#L45C1-L50
We can change the script so that the run-in-docker part is skippable.
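One hedged way to make the wrapper skippable is an environment-variable guard. This is a sketch only: `SKIP_DOCKER` is a hypothetical variable name, and the `echo` lines stand in for the real docker and Maven invocations in `build/build-in-docker`.

```shell
#!/bin/sh
# Sketch: guard the docker wrapper behind an env var so CI that is already
# inside a container can run the Maven step directly. SKIP_DOCKER is a
# hypothetical name; the echo lines are placeholders for the real commands.
run_build() {
  if [ "${SKIP_DOCKER:-0}" = "1" ]; then
    echo "running mvn directly"
  else
    echo "launching docker wrapper"
  fi
}

SKIP_DOCKER=1
run_build
```

With this shape, the CI workflow would export the variable before calling the script, while local developers keep the containerized path by default.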
OK, I'll do that in a follow-up PR.
Approving with a couple comments/questions.
  spark-rapids-jni:
    needs: changed-files
    uses: ./.github/workflows/spark-rapids-jni.yaml
    if: fromJSON(needs.changed-files.outputs.changed_file_groups).test_spark
I think we will always want to test JNI and Spark-RAPIDS with the same changed-files criteria. I'm a bit confused why this conversation ended up with two copies of the same criteria. #17781 (comment)
Suggested change:
-    if: fromJSON(needs.changed-files.outputs.changed_file_groups).test_spark
+    if: fromJSON(needs.changed-files.outputs.changed_file_groups).test_java
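To illustrate the shared-criteria point, one possible shape (a sketch only: the second job name and its workflow path are assumptions, not taken from this PR) gates both jobs on the same file-group key so the changed-files criteria are defined in exactly one place:

```yaml
# Sketch: both jobs reuse one file-group key (test_java), so the
# changed-files criteria live in a single definition.
spark-rapids-jni:
  needs: changed-files
  uses: ./.github/workflows/spark-rapids-jni.yaml
  if: fromJSON(needs.changed-files.outputs.changed_file_groups).test_java

spark-rapids:
  needs: changed-files
  uses: ./.github/workflows/spark-rapids.yaml
  if: fromJSON(needs.changed-files.outputs.changed_file_groups).test_java
```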
  spark-rapids-jni-build:
    runs-on: linux-amd64-cpu8
    container:
      image: rapidsai/ci-spark-rapids-jni:rockylinux8-cuda12.2.0
Do we want to use a newer CUDA version?
FYI @KyleFromNVIDIA GitHub has specific keywords that it recognizes as indicating a PR should close an issue. I've updated your PR description accordingly.
Description
Run the CI for spark-rapids-jni to ensure that we don't break their build.
Resolves #17337
Checklist