Description
❓ Question
I used Torch-TensorRT to compile a TorchScript model in C++. When compiling the model, or when loading the compiled module, it prints many warnings:
WARNING: [Torch-TensorRT] - Detected this engine is being instantitated in a multi-GPU system with multi-device safe mode disabled. For more on the implications of this as well as workarounds, see the linked documentation (https://pytorch.org/TensorRT/user_guide/runtime.html#multi-device-safe-mode)
(the same warning is printed repeatedly)
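For reference, this is roughly how I compile and load the module (a minimal sketch; the model path and input shape are placeholders). The warnings appear both during compilation and again when the saved module is loaded and run:

#include <torch/script.h>
#include "torch_tensorrt/torch_tensorrt.h"

int main() {
  // Load the scripted model (path is a placeholder)
  auto mod = torch::jit::load("model.ts");
  mod.to(torch::kCUDA);
  mod.eval();

  // Compile for a single fixed input shape (shape is a placeholder)
  auto spec = torch_tensorrt::ts::CompileSpec(
      {torch_tensorrt::Input(std::vector<int64_t>{1, 3, 224, 224})});
  auto trt_mod = torch_tensorrt::ts::compile(mod, spec);  // warnings printed here

  // Saving and later re-loading the compiled module prints the same warnings again
  trt_mod.save("trt_model.ts");
  auto reloaded = torch::jit::load("trt_model.ts");
  reloaded.to(torch::kCUDA);
  return 0;
}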
What you have already tried
I found this link useful, but it only covers the Python API.
I checked the source code, but I still haven't figured out how to set MULTI_DEVICE_SAFE_MODE from C++.
What can I do to address this warning?
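From reading the runtime source, it looks like the setting might be exposed as a TorchScript custom op in the tensorrt namespace, which seems to be what the Python torch_tensorrt.runtime.set_multi_device_safe_mode call goes through. If that is right, something like the sketch below might flip the flag from C++, but I am not sure this is the intended approach: the op name tensorrt::set_multi_device_safe_mode and its single-bool signature are my assumptions, and it would require the Torch-TensorRT runtime library to be linked so the op registration actually runs.

#include <torch/script.h>
#include <ATen/core/dispatch/Dispatcher.h>
#include <ATen/core/stack.h>

// Assumption: the Torch-TensorRT runtime registers a custom op
// "tensorrt::set_multi_device_safe_mode(bool)" (inferred from the Python
// API, not confirmed for C++). The Torch-TensorRT runtime library must be
// linked so that the op is actually registered.
void set_multi_device_safe_mode(bool enabled) {
  auto op = c10::Dispatcher::singleton().findSchemaOrThrow(
      "tensorrt::set_multi_device_safe_mode", /*overload_name=*/"");  // throws if not registered
  torch::jit::Stack stack;
  torch::jit::push(stack, enabled);
  op.callBoxed(stack);
}

If this works, I would call set_multi_device_safe_mode(true) once before loading or running the compiled module, which should match the Python torch_tensorrt.runtime.set_multi_device_safe_mode(True) shown in the docs. I would still appreciate confirmation of the supported C++ way to do this.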
Environment
Build information about Torch-TensorRT can be found by turning on debug messages
- PyTorch Version (e.g., 1.0):
- CPU Architecture: x86
- OS (e.g., Linux): Ubuntu 18
- How you installed PyTorch (conda, pip, libtorch, source): libtorch
- Build command you used (if compiling from source):
- Are you using local sources or building from archives:
- Python version:
- CUDA version: 12.2
- GPU models and configuration: 1080Ti
- Any other relevant information: