In addition to the already available implementations of the Array API, I think it could be interesting to have a lazy / meta implementation of the standard. What I mean is a small, minimal-dependency, standalone library, compatible with the Array API, that infers the shape and dtype of resulting arrays without ever allocating the data or executing any FLOPs.
PyTorch already has something like this with the "meta" device. For example:
```python
import torch

data = torch.ones((1000, 1000, 100), device="meta")
kernel = torch.ones((100, 10), device="meta")
result = torch.matmul(data, kernel)
print(result.shape)
print(result.dtype)
```
However, this misses device handling, for example, since the device is constrained to "meta". I presume Dask has something very similar. JAX must also have something very similar for jitted computations, but I think it is only exposed to users with a different API via jax.eval_shape(), not via a "meta" array object.
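For reference, a sketch of how the JAX route looks today (assuming jax is installed): jax.eval_shape traces the function abstractly over jax.ShapeDtypeStruct placeholders, so no data is ever allocated.

```python
import jax
import jax.numpy as jnp

def f(data, kernel):
    return jnp.matmul(data, kernel)

# ShapeDtypeStruct carries only shape and dtype, no buffer.
data = jax.ShapeDtypeStruct((1000, 1000, 100), jnp.float32)
kernel = jax.ShapeDtypeStruct((100, 10), jnp.float32)

out = jax.eval_shape(f, data, kernel)
print(out.shape)  # (1000, 1000, 10)
print(out.dtype)  # float32
```

Note that the function itself, not an array object, is the unit of abstraction here, which is exactly the API difference mentioned above.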
Similarly to the torch example, one would use a hypothetical library lazy_array_api:
```python
import lazy_array_api as xp

data = xp.ones((1000, 1000, 100), device="cpu")
kernel = xp.ones((100, 10), device="cpu")
result = xp.matmul(data, kernel)
print(result.shape)
print(result.dtype)
```
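To make the idea concrete, here is a minimal sketch of what such a library could look like internally. All names (MetaArray, ones, matmul) are hypothetical; only the stacked-matmul case from the example above is handled, and dtype promotion is left out.

```python
from dataclasses import dataclass

# Hypothetical sketch: an array object that records only metadata.
@dataclass(frozen=True)
class MetaArray:
    shape: tuple
    dtype: str
    device: str

def ones(shape, dtype="float64", device="cpu"):
    # Only shape/dtype/device are recorded; no buffer is ever allocated.
    return MetaArray(tuple(shape), dtype, device)

def matmul(a, b):
    # Shape rule for a stacked product: (..., n, k) @ (k, m) -> (..., n, m).
    # The full Array API also allows batched b and 1-D operands; omitted here.
    *batch, n, k = a.shape
    k2, m = b.shape
    if k != k2:
        raise ValueError(f"inner dimensions do not match: {k} != {k2}")
    if a.dtype != b.dtype:
        raise TypeError("dtype promotion is omitted in this sketch")
    return MetaArray((*batch, n, m), a.dtype, a.device)

data = ones((1000, 1000, 100), device="cpu")
kernel = ones((100, 10), device="cpu")
result = matmul(data, kernel)
print(result.shape)  # (1000, 1000, 10)
print(result.dtype)  # float64
```

Because nothing is allocated, shape mismatches in a large pipeline surface immediately rather than after minutes of compute.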
The use case I have in mind is mostly debugging, validation, and testing of computationally intensive algorithms ("dry runs"). For now I just wanted to share the idea and bring it up for discussion.