
🐛 [Bug] Tutorial Using Torch-TensorRT Directly From PyTorch raises KeyError #1688

Closed
@zshn25


Bug Description

Running the tutorial code raises a KeyError.

To Reproduce

Steps to reproduce the behavior:

  1. Visit https://pytorch.org/TensorRT/tutorials/use_from_pytorch.html
  2. Run the codeblocks (the failing cell is copied below)
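
For reference, this is the failing cell (`model` and `input_image_pytorch` are defined in earlier cells of the notebook; the input shape is from my run, the rest follows the tutorial):

    import torch
    import torch_tensorrt

    spec = {
        "forward": torch_tensorrt.ts.TensorRTCompileSpec(
            {
                "inputs": [torch_tensorrt.Input([1, 3, 480, 768])],
                "enabled_precisions": {torch.float, torch.half},
                "refit": False,
                "debug": False,
                "device": {
                    "device_type": torch_tensorrt.DeviceType.GPU,
                    "gpu_id": 0,
                    "dla_core": 0,
                    "allow_gpu_fallback": True,
                },
                "capability": torch_tensorrt.EngineCapability.default,
                "num_avg_timing_iters": 1,
            }
        )
    }
    scripted_model = torch.jit.script(model, input_image_pytorch)
    trt_model = torch._C._jit_to_backend("tensorrt", scripted_model, spec)

Running this cell produces: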
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
Cell In[16], line 5
      1 import torch
      2 import torch_tensorrt
      4 spec = {
----> 5     "forward": torch_tensorrt.ts.TensorRTCompileSpec(
      6         {
      7             "inputs": [torch_tensorrt.Input([1, 3, 480, 768])],
      8             "enabled_precisions": {torch.float, torch.half},
      9             "refit": False,
     10             "debug": False,
     11             "device": {
     12                 "device_type": torch_tensorrt.DeviceType.GPU,
     13                 "gpu_id": 0,
     14                 "dla_core": 0,
     15                 "allow_gpu_fallback": True,
     16             },
     17             "capability": torch_tensorrt.EngineCapability.default,
     18             "num_avg_timing_iters": 1,
     19         }
     20     )
     21 }
     22 scripted_model = torch.jit.script(model, input_image_pytorch)
     23 trt_model = torch._C._jit_to_backend("tensorrt", scripted_model, spec)

File ~/miniconda3/envs/inference/lib/python3.8/site-packages/torch_tensorrt/ts/_compile_spec.py:437, in TensorRTCompileSpec(inputs, input_signature, device, disable_tf32, sparse_weights, enabled_precisions, refit, debug, capability, num_avg_timing_iters, workspace_size, dla_sram_size, dla_local_dram_size, dla_global_dram_size, truncate_long_and_double, calibrator)
    380 """Utility to create a formated spec dictionary for using the PyTorch TensorRT backend
    381 
    382 Keyword Args:
   (...)
    415     torch.classes.tensorrt.CompileSpec: List of methods and formated spec objects to be provided to ``torch._C._jit_to_tensorrt``
    416 """
    418 compile_spec = {
    419     "inputs": inputs,
    420     # "input_signature": input_signature,
   (...)
    434     "truncate_long_and_double": truncate_long_and_double,
    435 }
--> 437 parsed_spec = _parse_compile_spec(compile_spec)
    439 backend_spec = torch.classes.tensorrt.CompileSpec()
    441 if input_signature is not None:

File ~/miniconda3/envs/inference/lib/python3.8/site-packages/torch_tensorrt/ts/_compile_spec.py:239, in _parse_compile_spec(compile_spec_)
    232 if len(compile_spec["inputs"]) > 0:
    233     if not all(
    234         [
    235             isinstance(i, torch.Tensor) or isinstance(i, Input)
    236             for i in compile_spec["inputs"]
    237         ]
    238     ):
--> 239         raise KeyError(
    240             "Input specs should be either torch_tensorrt.Input or torch.Tensor, found types: {}".format(
    241                 [type(i) for i in compile_spec["inputs"]]
    242             )
    243         )
    245     inputs = [
    246         Input.from_tensor(i) if isinstance(i, torch.Tensor) else i
    247         for i in compile_spec["inputs"]
    248     ]
    249     info.inputs = [i._to_internal() for i in inputs]

KeyError: "Input specs should be either torch_tensorrt.Input or torch.Tensor, found types: [<class 'str'>, <class 'str'>, <class 'str'>, <class 'str'>, <class 'str'>, <class 'str'>, <class 'str'>]"

Expected behavior

The tutorial codeblocks should run without raising an error.

Environment

Build information about Torch-TensorRT can be found by turning on debug messages

  • Torch-TensorRT Version (e.g. 1.0.0):
  • PyTorch Version (e.g. 1.0):
  • CPU Architecture:
  • OS (e.g., Linux):
  • How you installed PyTorch (conda, pip, libtorch, source):
  • Build command you used (if compiling from source):
  • Are you using local sources or building from archives:
  • Python version:
  • CUDA version:
  • GPU models and configuration:
  • Any other relevant information:

Additional context
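
The error appears to come from how the spec is constructed: the whole settings dict is passed as the first positional argument of torch_tensorrt.ts.TensorRTCompileSpec, which (per the signature shown in the traceback) binds it to the inputs parameter. Validation then iterates over that dict and sees its seven string keys, which matches the reported "found types" list. A possible workaround is to unpack the dict into keyword arguments so "inputs" receives the actual list of torch_tensorrt.Input objects. This is a minimal, untested sketch based on that signature, not a confirmed fix for the tutorial:

    import torch
    import torch_tensorrt

    # Unpack the settings so they are passed as keyword arguments instead of
    # one positional dict; "inputs" then receives the torch_tensorrt.Input list.
    spec = {
        "forward": torch_tensorrt.ts.TensorRTCompileSpec(
            **{
                "inputs": [torch_tensorrt.Input([1, 3, 480, 768])],
                "enabled_precisions": {torch.float, torch.half},
                "refit": False,
                "debug": False,
                "device": {
                    "device_type": torch_tensorrt.DeviceType.GPU,
                    "gpu_id": 0,
                    "dla_core": 0,
                    "allow_gpu_fallback": True,
                },
                "capability": torch_tensorrt.EngineCapability.default,
                "num_avg_timing_iters": 1,
            }
        )
    }
    scripted_model = torch.jit.script(model, input_image_pytorch)
    trt_model = torch._C._jit_to_backend("tensorrt", scripted_model, spec)

If that is the intended usage, the tutorial codeblock may simply be missing the ** unpacking (or should list the settings as keyword arguments directly).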
