Description
I'm trying to implement a deferred renderer. As part of this, I figured I'd batch the lights by type for efficiency, drawing one instance per light.
However, since some lights cast shadows and some don't, branching on that triggers a warning about texture sampling under non-uniform control flow. The light index is @interpolate(flat), but it is apparently still considered non-uniform. The shader nevertheless works fine on desktop (macOS); it just throws an ugly warning in the console that I can't suppress. I suspect the warning exists because the same code might not work on tiled GPUs.
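Roughly the pattern I mean, as a simplified sketch (not my actual code; all binding, struct, and field names here are placeholders): a flat-interpolated light index gates a textureSampleCompare call, which is enough to trip the uniformity analysis because textureSampleCompare uses implicit derivatives and therefore has to sit in uniform control flow.

```ts
// Simplified sketch of the shader pattern that produces the warning.
// Names (Light, shadowMap, shadowSampler, ...) are placeholders.
const fsWGSL = /* wgsl */ `
struct Light {
  color        : vec3<f32>,
  castsShadows : u32
}

@group(0) @binding(0) var<storage, read> lights : array<Light>;
@group(0) @binding(1) var shadowMap : texture_depth_2d;
@group(0) @binding(2) var shadowSampler : sampler_comparison;

@fragment
fn fs_main(
  @builtin(position) pos : vec4<f32>,
  @interpolate(flat) @location(0) lightIndex : u32
) -> @location(0) vec4<f32> {
  let light = lights[lightIndex];
  var shadow = 1.0;
  // The uniformity analysis treats user-defined inputs as non-uniform even
  // when flat-interpolated, so branching on them around an implicit-derivative
  // sample (textureSampleCompare) is flagged.
  if (light.castsShadows != 0u) {
    // pos.xy / 1024.0 is a placeholder UV; real code would project into light space.
    shadow = textureSampleCompare(shadowMap, shadowSampler, pos.xy / 1024.0, 0.5);
  }
  return vec4<f32>(light.color * shadow, 1.0);
}
`;
```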
I'm at a loss, however, for how to solve this efficiently. Without push constants, the only way I can pass per-draw data to my shader is through a buffer. To avoid creating a unique buffer per draw call I can use dynamic buffer offsets, but dynamic offsets must be aligned to 256 bytes (the default minUniformBufferOffsetAlignment).
This means that not only do I have to split a perfectly batchable call into one call per instance, but in order to pass a single 32-bit index to each light, so it can look up its data in a shared light buffer, I have to allocate 256 bytes per draw call. This seems ridiculous.
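To make the cost concrete, here is roughly what that workaround looks like on the API side (a simplified sketch assuming @webgpu/types; device, pass, and the counts stand in for my own objects):

```ts
declare const device: GPUDevice;
declare const pass: GPURenderPassEncoder;
declare const lightCount: number;
declare const lightVolumeVertexCount: number;

const STRIDE = 256; // device.limits.minUniformBufferOffsetAlignment defaults to 256

// Bind group layout with a dynamic-offset uniform binding holding a single u32.
const perDrawLayout = device.createBindGroupLayout({
  entries: [{
    binding: 0,
    visibility: GPUShaderStage.VERTEX | GPUShaderStage.FRAGMENT,
    buffer: { type: 'uniform', hasDynamicOffset: true, minBindingSize: 4 },
  }],
});

// One 256-byte slot per light, even though each slot only carries one index.
const perDrawBuffer = device.createBuffer({
  size: lightCount * STRIDE,
  usage: GPUBufferUsage.UNIFORM | GPUBufferUsage.COPY_DST,
});
const slots = new Uint32Array((lightCount * STRIDE) / 4);
for (let i = 0; i < lightCount; i++) {
  slots[(i * STRIDE) / 4] = i; // light index at the start of each 256-byte slot
}
device.queue.writeBuffer(perDrawBuffer, 0, slots);

const perDrawBindGroup = device.createBindGroup({
  layout: perDrawLayout,
  entries: [{ binding: 0, resource: { buffer: perDrawBuffer, offset: 0, size: 4 } }],
});

// The batch degenerates into one draw per light, re-bound with a new offset.
// (The pipeline layout is assumed to include perDrawLayout at group 1.)
for (let i = 0; i < lightCount; i++) {
  pass.setBindGroup(1, perDrawBindGroup, [i * STRIDE]);
  pass.draw(lightVolumeVertexCount, 1, 0, 0);
}
```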
At first, I thought maybe I could use firstInstance (read back via @builtin(instance_index)) as a poor man's push constant, but that value would also have to be passed from the vertex stage to the fragment stage, and would thus be non-uniform again. For indirect draws, a non-zero firstInstance is also an optional feature (indirect-first-instance), so I'm not even sure it's reliably available.
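For reference, the firstInstance idea would look something like the sketch below (again with placeholder names): the light index rides in on @builtin(instance_index), which includes the firstInstance passed to draw(), but it still has to cross to the fragment stage as a flat varying, so the uniformity analysis complains just the same.

```ts
declare const pass: GPURenderPassEncoder;
declare const lightCount: number;
declare const lightVolumeVertexCount: number;

const vsWGSL = /* wgsl */ `
struct VsOut {
  @builtin(position) pos : vec4<f32>,
  @interpolate(flat) @location(0) lightIndex : u32
}

@vertex
fn vs_main(
  @location(0) position : vec3<f32>,
  @builtin(instance_index) instanceIndex : u32 // = firstInstance + instance number
) -> VsOut {
  var out : VsOut;
  out.pos = vec4<f32>(position, 1.0);
  out.lightIndex = instanceIndex; // forwarded flat, still non-uniform in the fragment stage
  return out;
}
`;

// One draw per light, abusing firstInstance as a per-draw constant.
for (let i = 0; i < lightCount; i++) {
  pass.draw(lightVolumeVertexCount, 1, 0, /* firstInstance */ i);
}
```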
TLDR:
- are flat-interpolated varyings really supposed to be treated as non-uniform?
- is there some other way of getting push-constant-like per-draw data that I've missed?
- wtf is the point of throwing warnings about shaders that work fine?
- why does texturing suck so much in WebGPU? not having bindless is one thing, but not even being able to put an if around a texture fetch makes this a sad repeat of WebGL