Description
It's not entirely clear which side is broken here - likely a mix of missing wgpu error messages and browser bugs:
According to the WebGL2 spec, the constraints on gl.readPixels are mostly the same as in WebGL 1, which states: "Only two combinations of format and type are accepted. The first is format RGBA and type UNSIGNED_BYTE."
This is in contrast to Mozilla's documentation, which explicitly allows GL_RED for WebGL2 as well as integer variants.
More anecdotally, this SO answer also says it's not portable to read anything but RGBA.
Empirically on my side, I read out zeros with no warning/error on Firefox when trying to read back a wgpu::TextureFormat::R32Float.
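For reference, this is roughly the shape of the readback path I'm using. It's a simplified sketch, not the exact code: it assumes `device`, `queue` and a 256x256 R32Float texture with COPY_SRC usage already exist, a wgpu version where ImageDataLayout::bytes_per_row is an Option<u32>, and it pulls in futures_channel for the map wait so the same code can run on wasm.

```rust
// Simplified repro sketch (not the exact code from my project).
async fn read_back_r32float(
    device: &wgpu::Device,
    queue: &wgpu::Queue,
    texture: &wgpu::Texture,
) -> Vec<f32> {
    let (width, height) = (256u32, 256u32);
    let bytes_per_row = width * 4; // 4 bytes per R32Float texel, already 256-byte aligned here
    let staging = device.create_buffer(&wgpu::BufferDescriptor {
        label: Some("readback"),
        size: (bytes_per_row * height) as u64,
        usage: wgpu::BufferUsages::COPY_DST | wgpu::BufferUsages::MAP_READ,
        mapped_at_creation: false,
    });

    let mut encoder = device.create_command_encoder(&Default::default());
    encoder.copy_texture_to_buffer(
        texture.as_image_copy(),
        wgpu::ImageCopyBuffer {
            buffer: &staging,
            layout: wgpu::ImageDataLayout {
                offset: 0,
                bytes_per_row: Some(bytes_per_row),
                rows_per_image: None,
            },
        },
        wgpu::Extent3d { width, height, depth_or_array_layers: 1 },
    );
    queue.submit([encoder.finish()]);

    // Map the staging buffer and wait for the result via a oneshot channel,
    // since blocking is not an option on wasm.
    let slice = staging.slice(..);
    let (tx, rx) = futures_channel::oneshot::channel();
    slice.map_async(wgpu::MapMode::Read, move |r| {
        let _ = tx.send(r);
    });
    let _ = device.poll(wgpu::Maintain::Wait); // no-op on the web, needed on native
    rx.await.unwrap().expect("buffer map failed");

    let view = slice.get_mapped_range();
    view.chunks_exact(4)
        .map(|b| f32::from_le_bytes([b[0], b[1], b[2], b[3]]))
        .collect() // all 0.0 on Firefox/WebGL2 in my testing
}
```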
Furthermore, this also seems to mean that all RGBA_INTEGER formats from describe_texture_format in conv.rs are incorrect to pass to gl.readPixels on WebGL.
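To make the combinations concrete, this is the kind of format/type pairing in question. Illustrative only, using raw GL constants - this is not the actual conv.rs code:

```rust
// Illustrative only, not the actual conv.rs code: the kind of format/type pair
// that ends up being handed to gl.readPixels for the formats discussed here.
fn gl_readback_pair(format: wgpu::TextureFormat) -> Option<(u32, u32)> {
    // Raw constants from the GLES3/WebGL2 headers.
    const RGBA: u32 = 0x1908;
    const RGBA_INTEGER: u32 = 0x8D99;
    const RED: u32 = 0x1903;
    const UNSIGNED_BYTE: u32 = 0x1401;
    const UNSIGNED_INT: u32 = 0x1405;
    const FLOAT: u32 = 0x1406;
    match format {
        // The only pair WebGL1 guarantees for readPixels:
        wgpu::TextureFormat::Rgba8Unorm => Some((RGBA, UNSIGNED_BYTE)),
        // Integer variants, browser-dependent per the list below:
        wgpu::TextureFormat::Rgba8Uint => Some((RGBA_INTEGER, UNSIGNED_BYTE)),
        wgpu::TextureFormat::Rgba32Uint => Some((RGBA_INTEGER, UNSIGNED_INT)),
        // Reads back zeros on Firefox in my testing:
        wgpu::TextureFormat::R32Float => Some((RED, FLOAT)),
        _ => None,
    }
}
```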
It turns out this part depends on the browser & format combination (haven't checked the float case yet):
- rgba32uint: Works in Firefox, Chrome & Safari
- rgba8uint: Works in Chrome & Safari. On Firefox I get something back, but it's not integers (haven't dug in)
  - I hit this in particular when trying to work around the above f32 issue by doing a bitcast + bitshifts into Rgba8Uint (see the sketch after this list)
(all tested on a Mac, not unlikely to behave differently on other OSes)
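For the Rgba8Uint detour from the sub-bullet above, the readback side then just reinterprets the four u8 channels as the original f32 bits. A sketch with a hypothetical helper name, assuming the shader packed the bitcast f32 little-endian into r,g,b,a:

```rust
// Hypothetical helper: reassemble f32 values from an Rgba8Uint readback where the
// shader bitcast each f32 to u32 and wrote its bytes to r,g,b,a (little-endian).
fn reassemble_f32(rgba8: &[u8]) -> Vec<f32> {
    rgba8
        .chunks_exact(4)
        .map(|px| f32::from_bits(u32::from_le_bytes([px[0], px[1], px[2], px[3]])))
        .collect()
}
```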
For anyone else hitting this: My current workaround is to copy the f32 texture in question into an Rgba32Float texture, read that back, and then skip through it (keeping every fourth float). While crazy in overhead, this seems to be the most portable way of doing things, but I still have to do more testing.
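The last step of that workaround is then just striding over the Rgba32Float data and keeping the red channel (again only a sketch, the helper name is mine):

```rust
// Hypothetical helper: take the raw bytes read back from the Rgba32Float copy and
// keep only the red channel, i.e. the first f32 of every 16-byte texel.
fn extract_red_channel(rgba32f_bytes: &[u8]) -> Vec<f32> {
    rgba32f_bytes
        .chunks_exact(16) // 4 channels * 4 bytes each per Rgba32Float texel
        .map(|px| f32::from_le_bytes([px[0], px[1], px[2], px[3]]))
        .collect()
}
```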