ROCm
====

ROCm is the name of the software stack used by AMD GPUs. It includes
the ROCm Runtime (ROCr), the HIP programming model, and numerous
numerical and machine learning libraries tuned for the AMD Instinct
accelerators. More information can be found on the
`AMD webpages <https://www.amd.com/en/graphics/servers-solutions-rocm>`_.


Building Open MPI with ROCm support
-----------------------------------

ROCm-aware support means that the MPI library can send and receive
data from AMD GPU device buffers directly. As of today, ROCm support
is available through UCX. While other communication transports might
work as well, UCX is the only transport formally supported in Open MPI
|ompi_ver| for ROCm devices.

Since UCX provides the ROCm support, it is important to ensure that
UCX itself is built with ROCm support.

To see if your UCX library was built with ROCm support, run the
following command:

.. code-block:: sh

   # Check if UCX was built with ROCm support
   shell$ ucx_info -v

   # configured with: --with-rocm=/opt/rocm --without-knem --without-cuda
|
| 33 | +If you need to build the UCX library yourself to include ROCm support, |
| 34 | +please see the UCX documentation for `building UCX with Open MPI: |
| 35 | +<https://openucx.readthedocs.io/en/master/running.html#openmpi-with-ucx>`_ |
| 36 | + |
| 37 | +It should look something like: |
| 38 | + |
| 39 | +.. code-block:: sh |
| 40 | +
|
| 41 | + # Configure UCX with ROCm support |
| 42 | + shell$ cd ucx |
| 43 | + shell$ ./configure --prefix=/path/to/ucx-rocm-install \ |
| 44 | + --with-rocm=/opt/rocm --without-knem |
| 45 | +
|
| 46 | + # Configure Open MPI with UCX and ROCm support |
| 47 | + shell$ cd ompi |
| 48 | + shell$ ./configure --with-rocm=/opt/rocm \ |
| 49 | + --with-ucx=/path/to/ucx-rocm-install \ |
| 50 | + <other configure params> |
| 51 | +

/////////////////////////////////////////////////////////////////////////

Checking that Open MPI has been built with ROCm support
-------------------------------------------------------

Verify that Open MPI has been built with ROCm support using the
:ref:`ompi_info(1) <man1-ompi_info>` command:

.. code-block:: sh

   # Use ompi_info to verify ROCm support in Open MPI
   shell$ ompi_info | grep "MPI extensions"
          MPI extensions: affinity, cuda, ftmpi, rocm

/////////////////////////////////////////////////////////////////////////


Using ROCm-aware UCX with Open MPI
----------------------------------

If UCX and Open MPI have been configured with ROCm support, specifying
the UCX PML component is sufficient to take advantage of the ROCm
support in the libraries. For example, the command to execute the
``osu_latency`` benchmark from the `OSU benchmarks
<https://mvapich.cse.ohio-state.edu/benchmarks>`_ with ROCm buffers
using Open MPI and UCX ROCm support looks something like this:

.. code-block:: sh

   shell$ mpirun -n 2 --mca pml ucx \
          ./osu_latency -d rocm D D

Note: some additional configure flags are required to compile the OSU
benchmarks with support for ROCm buffers. Please refer to the `UCX ROCm
instructions
<https://github.com/openucx/ucx/wiki/Build-and-run-ROCM-UCX-OpenMPI>`_
for details.


/////////////////////////////////////////////////////////////////////////

Runtime querying of ROCm support in Open MPI
--------------------------------------------

Starting with Open MPI v5.0.0, :ref:`MPIX_Query_rocm_support(3)
<mpix_query_rocm_support>` is available as an extension to check
the availability of ROCm support in the library. To use this
function, the code needs to include ``mpi-ext.h``. Note that
``mpi-ext.h`` is an Open MPI specific header file.
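
A minimal sketch of such a run-time query is shown below. This is not
taken from the Open MPI documentation; it assumes an Open MPI v5.0.0
(or later) installation and that ``mpi-ext.h`` defines the
``OMPI_HAVE_MPI_EXT_ROCM`` guard macro when the ROCm extension is
present:

.. code-block:: c

   /* Sketch: query ROCm support at run time.
    * Assumes Open MPI >= v5.0.0; compile with mpicc. */
   #include <stdio.h>
   #include <mpi.h>
   #include <mpi-ext.h>   /* Open MPI specific header */

   int main(int argc, char *argv[])
   {
       MPI_Init(&argc, &argv);

   #if defined(OMPI_HAVE_MPI_EXT_ROCM) && OMPI_HAVE_MPI_EXT_ROCM
       /* Nonzero if the library was built with ROCm support */
       if (MPIX_Query_rocm_support()) {
           printf("This Open MPI library has ROCm support\n");
       } else {
           printf("This Open MPI library was built without ROCm support\n");
       }
   #else
       printf("MPIX_Query_rocm_support() is not available\n");
   #endif

       MPI_Finalize();
       return 0;
   }

Because the check happens at run time rather than compile time, the
same binary reports correctly when moved between Open MPI builds with
and without ROCm support.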

/////////////////////////////////////////////////////////////////////////

Collective component supporting ROCm device memory
--------------------------------------------------

The `UCC <https://github.com/openucx/ucc>`_ based collective component
in Open MPI can be configured and compiled to include ROCm support.

An example of configuring UCC and Open MPI with ROCm support is shown
below:

.. code-block:: sh

   # Configure and compile UCC with ROCm support
   shell$ cd ucc
   shell$ ./configure --with-rocm=/opt/rocm \
          --with-ucx=/path/to/ucx-rocm-install \
          --prefix=/path/to/ucc-rocm-install
   shell$ make -j && make install

   # Configure and compile Open MPI with UCX, UCC, and ROCm support
   shell$ cd ompi
   shell$ ./configure --with-rocm=/opt/rocm \
          --with-ucx=/path/to/ucx-rocm-install \
          --with-ucc=/path/to/ucc-rocm-install

Using the UCC component in an application requires setting some
additional MCA parameters:

.. code-block:: sh

   shell$ mpirun --mca pml ucx --mca osc ucx \
          --mca coll_ucc_enable 1 \
          --mca coll_ucc_priority 100 -np 64 ./my_mpi_app