Only pickup OMPI_MCA_ prefixed envars #3288


Closed · wants to merge 1 commit

Conversation

rhc54 (Contributor) commented Apr 5, 2017

Signed-off-by: Ralph Castain [email protected]

jjhursey (Member) commented Apr 5, 2017

Though I don't see anything wrong with this PR, it should be noted that this is a change in behavior going into master. It is also well-documented behavior: the mpirun man page states that we pass along any variable prefixed with OMPI_.

I'd like to see more discussion on this PR before it's merged. Let's put it on the agenda for the next teleconf.

hppritcha (Member) commented:

@gpaulsen, please add this to the agenda for the next devel core con call.

rhc54 (Contributor, Author) commented Apr 5, 2017

I don't know who put that in there, but it is simply wrong. We never used to pick up anything other than OMPI_MCA_, which is the intended behavior. If you look at the code, it expressly states that we will pick up and forward MCA params, not things that are prefixed OMPI_, for one very simple reason: we set OMPI_foo variables as a courtesy so MPI procs can find specific values prior to calling MPI_Init. Thus, they are specific to the proc in which we set them. Forwarding them would cause errors.

hppritcha (Member) commented:
Well, according to git blame, Josh Hursey added the OMPI* text to the relevant section of the man page in 2006.

jjhursey (Member) commented Apr 6, 2017

I tell ya that guy is trouble 😄 We should try to find a way to blame @jsquyres though.

jsquyres (Member) commented:
Should this PR be amended to also update the man page?

Or, since the genie has escaped the bottle, do we have to live with the fact that OMPI exports OMPI_* env vars?

Or, since @rhc54 mentions that that is problematic, should we make another approved prefix for variables that can be exported (e.g., OMPI_EXPORT_*)?

gpaulsen (Member) commented:
So, we've been relying on OMPI_* envs being exported to the environment, so we'd prefer to keep it that way. That said, we haven't seen any issues with the current behavior yet, and we could rework our usage if it's causing problems.

rhc54 (Contributor, Author) commented Apr 11, 2017

Could you give some idea as to what you are forwarding that isn't an MCA param, and why?

rhc54 (Contributor, Author) commented Apr 11, 2017

Note that we aren't talking about envars that we export to the client procs; we are talking about OMPI_ vars that are not MCA params being picked up by mpirun and forwarded to remote procs.

gpaulsen (Member) commented:
So, we have an mpirun wrapper script that needs to propagate some data to the ranks.
We're working to move all of that into MCA parameters as our longer-term solution (also to work well with direct launch without mpirun), but there are a few that we haven't converted yet.
So we have no objection to fixing this. We've added a mechanism to "export other stuff" (OMPI_MCA_mca_base_env_list_distro=var,var,var,...), so we're really fine with turning off the OMPI_* propagation (assuming we keep MCA intact); it will just push us to do the right thing sooner. :)

gpaulsen (Member) commented:
We discussed this issue (Forwarding all of OMPI_*) in the web-ex today. There are a lot of facets to changing this behavior, and we'll continue to discuss in next week's meeting (4/18).

@rhc54 rhc54 closed this May 31, 2017
@rhc54 rhc54 deleted the topic/async branch May 31, 2017 14:40
5 participants