fortran/mpif-h: fix MPI_Alltoallw() binding #6813


Merged

Conversation

ggouaillardet
Contributor

  • ignore sendcounts, sendispls and sendtypes arguments when MPI_IN_PLACE is used
  • use the right size when an inter-communicator is used.
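The two fixes above can be sketched in plain C. Everything here (`MY_IN_PLACE`, `comm_t`, `alltoallw_binding`, `array_size`) is a hypothetical stand-in so the sketch compiles without an MPI installation; it is not the actual Open MPI code, only an illustration of the corrected argument handling.

```c
#include <stddef.h>

/* Hypothetical sentinel standing in for MPI_IN_PLACE. */
#define MY_IN_PLACE ((void *)-1)

/* Hypothetical communicator stand-in. */
typedef struct {
    int is_inter;     /* 1 for an inter-communicator */
    int local_size;
    int remote_size;
} comm_t;

/* For an inter-communicator, the counts/displs/types arrays are sized
 * by the remote group, not the local one. */
static int array_size(const comm_t *comm) {
    return comm->is_inter ? comm->remote_size : comm->local_size;
}

/* Sketch of the corrected binding logic: when sendbuf is MPI_IN_PLACE,
 * the send-side arrays must not be dereferenced at all.  Returns the
 * number of elements translated (Fortran-to-C translation is faked by
 * a simple copy into `translated`). */
static int alltoallw_binding(const void *sendbuf, const int *sendcounts,
                             const comm_t *comm, int *translated) {
    int n = array_size(comm);
    if (sendbuf == MY_IN_PLACE) {
        return 0;  /* sendcounts/sdispls/sendtypes are ignored */
    }
    for (int i = 0; i < n; i++) {
        translated[i] = sendcounts[i];
    }
    return n;
}
```

The point of the sketch is that the MPI_IN_PLACE test must happen before any send-side array is touched, and that the inter-communicator case must size the loop by the remote group.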

Thanks Markus Geimer for reporting this.

Refs. #5459

Signed-off-by: Gilles Gouaillardet [email protected]

@ggouaillardet ggouaillardet force-pushed the topic/alltoallw_inplace_mpifh branch from 31c8aa9 to cdaed89 Compare July 13, 2019 13:35
@jsquyres jsquyres added the NEWS label Jul 15, 2019
@jsquyres jsquyres merged commit 0df0e5c into open-mpi:master Jul 15, 2019
@jsquyres
Member

@ggouaillardet Does this need to get cherry picked to release branches?

@ggouaillardet
Contributor Author

@jsquyres yes, this has to be backported to the release branches.

  • the good news is that this can be seen as a corner case
  • the bad news is that the reported issue is the tip of the iceberg: the mpif-h bindings do not ignore parameters that should be ignored. Also, non-blocking collectives use intermediate malloc'ed arrays that are freed right after the PMPI subroutine returns, which is incorrect, since they can only be freed after the non-blocking operation completes (or after the request is freed, in the case of a persistent non-blocking collective).
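The lifetime bug described in the second bullet can be sketched as follows. The names (`nb_request_t`, `start_op`, `complete_op`) are hypothetical stand-ins, not Open MPI internals; the sketch only illustrates the fix of attaching the malloc'ed translation arrays to the request and freeing them at completion instead of when the binding returns.

```c
#include <stdlib.h>

/* Hypothetical request object owning the translated
 * count/displacement/type arrays allocated by the mpif-h layer. */
typedef struct {
    int *counts;   /* malloc'ed translation buffer */
    int  done;
} nb_request_t;

static nb_request_t *start_op(int n) {
    nb_request_t *req = malloc(sizeof *req);
    req->counts = malloc(n * sizeof *req->counts);
    req->done = 0;
    /* ... the translated arrays would be handed to the underlying
     * PMPI call here.  Do NOT free req->counts before returning: the
     * non-blocking operation is still in flight and may read them
     * after this function returns. */
    return req;
}

static void complete_op(nb_request_t *req) {
    /* Safe now: the operation has completed. */
    free(req->counts);
    req->counts = NULL;
    req->done = 1;
}
```

The design point is simply that the allocation's owner is the request, so its lifetime matches the operation's, not the binding call's.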

I am working on a patch that keeps growing, so I'd rather backport all the fixes at once.
