docs: update verbiage for using srun #11746

Merged 1 commit on Jun 21, 2023
docs/launching-apps/slurm.rst (19 additions, 18 deletions)
Using Slurm's "direct launch" functionality
-------------------------------------------

Assuming that Slurm was configured with its PMIx plugin, you can use
``srun`` to "direct launch" Open MPI applications without the use of
Open MPI's ``mpirun`` command.

.. note:: Using direct launch can be *slightly* faster when launching
   very large MPI jobs (i.e., thousands or millions of MPI processes
   in a single job), but it has significantly fewer features than
   Open MPI's ``mpirun``.
First, you must ensure that Slurm was built and installed with PMIx
support.  This can be determined as shown below:

.. code-block:: sh

   shell$ srun --mpi=list
   MPI plugin types are...
   none
   pmi2
   pmix
   specific pmix plugin versions available: pmix_v4

The output from ``srun`` may vary somewhat depending on the version of
Slurm installed.  If ``pmix`` is not present in the output, then you
will not be able to use ``srun`` to launch Open MPI applications.
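The check above can also be scripted.  A minimal sketch, shown here
running against the sample output so it works without a Slurm
installation; on a real system, replace the sample text with the output
of ``srun --mpi=list 2>&1``:

.. code-block:: sh

   #!/bin/sh
   # Sample `srun --mpi=list` output; on a real Slurm system, use:
   #   output=$(srun --mpi=list 2>&1)
   output='MPI plugin types are...
   none
   pmi2
   pmix
   specific pmix plugin versions available: pmix_v4'

   # Direct launch of Open MPI requires the pmix plugin to be listed.
   if printf '%s\n' "$output" | grep -q 'pmix'; then
       echo "PMIx plugin found"
   else
       echo "PMIx plugin not found"
   fi

Note that the exact wording of the plugin list is illustrative; only
the presence of ``pmix`` in the output matters.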

.. note:: PMI-2 is not supported in Open MPI 5.0.0 and later releases.

Provided the Slurm installation includes the PMIx plugin, Open MPI applications
can then be launched directly via the ``srun`` command. For example:

.. code-block:: sh

   shell$ srun -N 4 --mpi=pmix mpi-hello-world

Or you can use ``sbatch`` with a script:

.. code-block:: sh

   shell$ cat my_script.sh
   #!/bin/sh
   srun --mpi=pmix mpi-hello-world
   shell$ sbatch -N 4 my_script.sh
   srun: jobid 1235 submitted
   shell$
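If you prefer not to pass ``--mpi=pmix`` on every invocation, Slurm can
be told to use PMIx by default.  A sketch of the relevant settings
(cluster-wide values belong in ``slurm.conf`` and require administrator
access; consult your site's Slurm configuration before changing it):

.. code-block:: sh

   # Cluster-wide default, set in slurm.conf (administrator only):
   #   MpiDefault=pmix

   # Per user/session alternative, via srun's input environment:
   export SLURM_MPI_TYPE=pmix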