Add miniapps as tests #1112
base: master
Conversation
Not happy about spreading `pika::wait`s everywhere in the miniapps. I'd prefer another solution. Would scheduling and `sync_wait`ing an `Ibarrier` work?
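For illustration, a minimal sketch of the idea behind the suggestion, written against plain MPI with a polling loop rather than pika's sender/`sync_wait` machinery or DLA-Future's communication wrappers (whose exact API is not shown in this thread); `nonblocking_barrier` is a hypothetical helper, not project code:

```cpp
// Sketch only: post a non-blocking barrier and poll it to completion,
// yielding in between, instead of blocking the (single) worker thread
// inside MPI_Barrier while other work may still need to run.
#include <mpi.h>
#include <thread>

void nonblocking_barrier(MPI_Comm comm) {
  MPI_Request req;
  MPI_Ibarrier(comm, &req);  // post the barrier without blocking

  int done = 0;
  while (!done) {
    MPI_Test(&req, &done, MPI_STATUS_IGNORE);  // poll for completion
    if (!done)
      std::this_thread::yield();  // give other work a chance to progress
  }
}
```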
This reverts commit 0755556.
…ete on CommunicatorGrid
One more hang after the end of the Hermitian multiplication test: https://gitlab.com/cscs-ci/ci-testing/webhook-ci/mirrors/4700071344751697/7514005670787789/-/jobs/7914784871. I've restarted the job.
LGTM
LGTM.
Adds the algorithms miniapps as tests. Does not add `miniapp_laset` as it's not really testing anything too useful, and it messes with the `check-threads` setup (which checks if OS threads are being spawned that shouldn't be spawned). The miniapps are run with 6 ranks, `--grid-rows 3 --grid-cols 2`, and otherwise default options.

This splits `DLAF_addTest` into two separate functions: one which adds a test given a target (`DLAF_addTargetTest`), used to add the miniapps since they already have CMake targets, and `DLAF_addTest`, which creates an executable and then calls `DLAF_addTargetTest`. The behaviour of `DLAF_addTest` remains unchanged.

This also adds a `CATEGORY` parameter that can be added to tests as a label, alongside the `RANK_*` labels. The `CATEGORY` defaults to `UNIT` for "unit tests", and I set it to `MINIAPP` for the miniapps. I'm fine with making the choice explicit and/or renaming/adding categories. I then use the `RANK` and `CATEGORY` labels to generate CI jobs the same way `RANK` is used currently on `master`. Combinations that have no tests are not added as CI jobs.

Finally, many miniapps were hanging on the CUDA configurations, where they run with only one worker thread and one MPI thread. I've changed the `waitLocalTiles` calls to `pika::wait` calls before `MPI_Barrier` is called, to make sure that really nothing is running anymore and deadlocks are avoided. I've only changed them to `pika::wait` after the algorithms are called, but for consistency it might make sense to use `pika::wait` anywhere before `MPI_Barrier` is called?
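For reference, a minimal sketch of the pattern described above, assuming `pika::wait()` takes no arguments and blocks until no pika work remains; `run_algorithm`, `run_miniapp_iteration`, and the include location are illustrative, not the miniapps' actual code:

```cpp
#include <mpi.h>
#include <pika/init.hpp>  // assumed location of pika::wait()

void run_algorithm();  // hypothetical stand-in for a DLA-Future algorithm call

void run_miniapp_iteration(MPI_Comm comm) {
  run_algorithm();  // launches asynchronous pika work

  // Drain all pika work before blocking in MPI: with a single worker thread,
  // sitting in MPI_Barrier while pika tasks are still queued can deadlock.
  // A waitLocalTiles() call only waits for the tiles of one matrix, whereas
  // pika::wait() is assumed to wait until nothing is running at all.
  pika::wait();

  MPI_Barrier(comm);
}
```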