🐛 Describe the bug
When a model contains tensors with a dtype that the Vulkan delegate does not support, Vulkan lowering currently fails with an assertion error. Ideally, the Vulkan partitioner would include a constraint so that ops touching these dtypes are not partitioned to Vulkan and instead fall back to CPU (see the sketch after the list below).
Affected dtypes:
- float64
- int16
- uint16
- uint32
- uint64
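A minimal sketch of the kind of dtype check the partitioner could apply is below. The helper name and the supported-dtype set are assumptions for illustration only (the set is guessed from what the serializer appears to handle), not the actual Vulkan partitioner API; the real check would belong in the partitioner's per-node support logic.

import torch

# Assumed set of dtypes the Vulkan delegate can serialize; illustrative, not authoritative.
_VULKAN_SUPPORTED_DTYPES = {
    torch.bool,
    torch.uint8,
    torch.int8,
    torch.int32,
    torch.float16,
    torch.float32,
}


def node_produces_supported_dtypes(node: torch.fx.Node) -> bool:
    # Hypothetical helper: in an exported program, node.meta["val"] holds the
    # FakeTensor(s) produced by the node, so the dtype can be inspected here.
    val = node.meta.get("val", None)
    tensors = val if isinstance(val, (list, tuple)) else [val]
    for t in tensors:
        if isinstance(t, torch.Tensor) and t.dtype not in _VULKAN_SUPPORTED_DTYPES:
            # Reject the node so the partitioner leaves it on CPU instead of
            # hitting the assertion in vulkan_graph_builder during preprocess.
            return False
    return True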
Repro:
import torch

from executorch.backends.vulkan.partitioner.vulkan_partitioner import VulkanPartitioner
from executorch.exir import to_edge_transform_and_lower, EdgeCompileConfig, to_edge
from executorch.extension.pybindings.portable_lib import _load_for_executorch_from_buffer


class Model(torch.nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, x, y):
        return x + y


model = Model()
inputs = (
    torch.randn(5, 5).to(torch.float64),
    torch.randn(5, 5).to(torch.float64),
)

eager_outputs = model(*inputs)

ep = torch.export.export(model.eval(), inputs)
print(ep)

lowered = to_edge_transform_and_lower(
    ep,
    partitioner=[VulkanPartitioner()],
    compile_config=EdgeCompileConfig(_check_ir_validity=False),
).to_executorch()
print(lowered.exported_program())

et_model = _load_for_executorch_from_buffer(lowered.buffer)
et_outputs = et_model([*inputs])[0]
print(et_outputs)

et_outputs - eager_outputs
Output:
File ~/src/executorch/src/executorch/backends/vulkan/serialization/vulkan_graph_builder.py:82, in VkGraphBuilder.get_vk_datatype(torch_dtype)
     80     return vk_graph_schema.VkDataType.INT32
     81 else:
---> 82     raise AssertionError(f"Invalid dtype for vulkan_preprocess ({torch_dtype})")

AssertionError: Invalid dtype for vulkan_preprocess (torch.float64)
Versions
executorch commit 67b6009 (Jun 14)