
Coarray-fortran #9

Open
zerothi opened this issue May 10, 2022 · 7 comments
zerothi commented May 10, 2022

In this workgroup there seems to be interest in aligning coarrays with MPI?

I would really second this!

As I see it, coarrays in Fortran haven't been widely adopted because they place too many restrictions on the program: implementing a library with coarrays makes it very hard to use from a host program that relies on MPI.
A way to map a set of coarray images to an MPI group/communicator with a 1-to-1 correspondence would be great.

If this is not the place, then sorry for the blurb! :)

@jeffhammond
Member

You might get more interest from https://github.com/mpiwg-rma/rma-issues/issues because MPI RMA and coarray Fortran are both one-sided models, but I don't think it matters that much.

@zerothi
Author

zerothi commented May 10, 2022

Does that mean I should replicate the issue there?

I suspected that CAF did not belong there since, while it is indeed RMA, it is more a fundamental problem (mapping a CAF codimension to a communicator) than an RMA problem. Not that I wouldn't want to duplicate it? ;)

@jeffhammond
Member

Regarding coarray teams and MPI groups/communicators, there are two issues:

  1. What is the relationship between coarray images and MPI processes/ranks?
  2. What is the relationship between FORM TEAM and MPI_Comm_create_group, for example?

The hard part is 1. In practice, Intel Fortran, Cray Fortran and GCC with OpenCoarrays all have an equivalent execution model for coarray images and MPI processes, but nothing guarantees this. I don't expect this to ever be standardized by Fortran or MPI, so it's going to be an implementation detail the user/application needs to query/verify.
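As a sketch of that query/verify step, here is a hypothetical startup check, assuming the common one-image-per-process mapping (image i ↔ rank i-1); neither standard guarantees this, which is exactly the point:

```fortran
! Hypothetical check that coarray image K corresponds to MPI rank K-1.
! Assumes the implementation runs one image per MPI process (as Intel,
! Cray, and GCC/OpenCoarrays do in practice).
program check_image_rank
  use mpi_f08
  implicit none
  integer :: rank, img
  logical :: already
  logical :: ok[*]     ! one flag per image

  ! With OpenCoarrays the coarray runtime has usually initialized MPI
  ! already; only initialize it ourselves if it has not.
  call MPI_Initialized(already)
  if (.not. already) call MPI_Init()

  call MPI_Comm_rank(MPI_COMM_WORLD, rank)

  ! Expected mapping: image i <-> rank i-1. Record the result per image.
  ok = (rank == this_image() - 1)
  sync all

  if (this_image() == 1) then
    do img = 1, num_images()
      if (.not. ok[img]) error stop "image/rank mapping is not the identity"
    end do
    print *, "coarray images and MPI ranks agree"
  end if
end program check_image_rank
```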

As for 2, it seems that FORM TEAM is equivalent to MPI_Comm_split, at least for the simpler use cases.
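A minimal sketch of that correspondence, expressing the same even/odd split both ways (illustrative only; assumes one image per rank and an already-initialized MPI runtime):

```fortran
! Sketch: the same even/odd split with FORM TEAM (Fortran 2018) and
! with MPI_Comm_split. Whether the resulting team and communicator
! contain the same processes is an implementation detail (see point 1).
program split_both_ways
  use, intrinsic :: iso_fortran_env, only : team_type
  use mpi_f08
  implicit none
  type(team_type) :: team
  type(MPI_Comm)  :: comm
  integer :: color

  color = mod(this_image() - 1, 2)   ! 0 = "even" images, 1 = "odd" images

  ! Fortran 2018: images passing the same team number form a team.
  form team (color + 1, team)        ! team numbers must be positive

  ! MPI: ranks passing the same color form a communicator; the key keeps
  ! the relative ordering identical to the image ordering.
  call MPI_Comm_split(MPI_COMM_WORLD, color, this_image() - 1, comm)
end program split_both_ways
```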

@jeffhammond
Member

No need to duplicate it. As long as you are aware of the RMA connections, that's sufficient.

@zerothi
Author

zerothi commented May 10, 2022

I completely agree that it isn't easy! And it will most likely put restrictions on the way the CAF images are allocated.

The main problem for adoption of CAF is that it is not portable as a library (my perspective ;-0). It is difficult, if not impossible, to pass a distributed-memory array from an MPI application to a CAF library... :(

At the same time, CAF could in principle use OpenMP under the hood on shared-memory machines. So I agree it ain't trivial, but I think working towards such a goal would be ideal! ;)

@jeffhammond
Member

If you have data that isn't in coarrays, then you'll have to copy it into coarrays when going from MPI to coarrays; but if you allocate memory as coarrays, there's no reason you can't use that memory with MPI libraries.
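A minimal sketch of the second point, assuming one image per MPI process and an already-initialized MPI runtime: the local part of a coarray is ordinary memory and can be passed to MPI routines directly.

```fortran
! Sketch: a coarray-allocated buffer participating in a normal MPI
! collective. Assumes the coarray runtime has set up MPI and that
! images and ranks coincide.
program coarray_buffer_with_mpi
  use mpi_f08
  implicit none
  real, allocatable :: a(:)[:]
  real :: local_sum, total

  allocate(a(1000)[*])
  a = real(this_image())
  local_sum = sum(a)

  ! The local piece of the coarray is just memory to MPI.
  call MPI_Allreduce(local_sum, total, 1, MPI_REAL, MPI_SUM, MPI_COMM_WORLD)

  if (this_image() == 1) print *, "global sum =", total
end program coarray_buffer_with_mpi
```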

Coarray images can't be threads without compiler hacks to have all the global mutable state be thread-private, which basically means processes.

@jeffhammond
Member

But you might also want to consider just using MPI RMA instead of coarrays. RMA isn't perfect, but it's a lot more portable than coarrays right now.
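For comparison, a sketch of the one-sided neighbour read that `x(:) = a(:)[img]` expresses in coarray Fortran, written with MPI RMA instead (a hypothetical ring pattern; identifiers are illustrative):

```fortran
! Sketch: MPI RMA equivalent of a coarray remote read. Each rank exposes
! a(:) in a window and reads the next rank's copy one-sidedly.
program rma_instead_of_caf
  use mpi_f08
  implicit none
  integer, parameter :: n = 1000
  real, target :: a(n)
  real :: x(n)
  type(MPI_Win) :: win
  integer :: rank, nranks, neighbour
  integer(kind=MPI_ADDRESS_KIND) :: winsize, disp

  call MPI_Init()
  call MPI_Comm_rank(MPI_COMM_WORLD, rank)
  call MPI_Comm_size(MPI_COMM_WORLD, nranks)
  a = real(rank)

  ! Expose a(:) for one-sided access, much like declaring it a coarray.
  winsize = int(n, MPI_ADDRESS_KIND) * storage_size(a) / 8
  call MPI_Win_create(a, winsize, int(storage_size(a) / 8), MPI_INFO_NULL, &
                      MPI_COMM_WORLD, win)

  ! One-sided read from the next rank: the analogue of x(:) = a(:)[img].
  neighbour = mod(rank + 1, nranks)
  disp = 0
  call MPI_Win_lock(MPI_LOCK_SHARED, neighbour, 0, win)
  call MPI_Get(x, n, MPI_REAL, neighbour, disp, n, MPI_REAL, win)
  call MPI_Win_unlock(neighbour, win)

  call MPI_Win_free(win)
  call MPI_Finalize()
end program rma_instead_of_caf
```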
