This shows various cutouts without performing post-process bleeding, i.e., the cutouts’ data is not modified by Mineways.
To see how bleeding helps, view this model: https://skfb.ly/JPH6
Please note: bleeding is in my opinion the wrong answer; instead, the renderer should be fixed. However, since I can’t fix someone else’s renderer, bleeding gets rid of the black fringing.
Also, some people like the black fringing, as it gives the decal a toon effect. My feeling is that if you want a toon effect, add the black outline yourself. Don’t rely on a rendering artifact (that might get fixed) to get this result.
Here’s my article about this problem: http://www.realtimerendering.com/blog/gpus-prefer-premultiplication/
Creative Commons Attribution
7 comments
Thanks, very informative!
Although our shaders and postprocesses are in linear (usually RGBM encoding or float buffer if needed), the sRGB textures are still filtered as sRGB.
Using the sRGB WebGL extension is an easy solution for us, as it requires very little maintenance, but the spec states:
> Ideally, implementations should perform this color conversion on each sample prior to filtering, but implementations are allowed to perform this conversion after filtering (though this post-filtering approach is inferior to converting from sRGB prior to filtering).
Surprisingly, that post-filtering allowance makes the extension almost useless for us, then.
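To see why the conversion order matters, here is a minimal sketch (plain Python standing in for the GPU's texture filter) that averages two texels both ways: filtering the raw sRGB codes and converting afterwards, versus converting each texel to linear first and then filtering.

```python
def srgb_to_linear(c):
    """Convert one sRGB channel value in [0, 1] to linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

# Two neighboring texels: pure black and pure white (sRGB encoded).
a, b = 0.0, 1.0

# Post-filtering conversion: average the sRGB codes, then linearize.
post = srgb_to_linear((a + b) / 2)   # ~0.214

# Pre-filtering conversion: linearize each texel, then average.
pre = (srgb_to_linear(a) + srgb_to_linear(b)) / 2   # 0.5

print(post, pre)
```

The two orders disagree by more than a factor of two in linear light, which is exactly the error an implementation taking the "allowed" post-filtering shortcut bakes into every filtered sample.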
As for cutouts, we also support them (alpha masking) and encountered the mipmap issue.
There are some workarounds (http://the-witness.net/news/2010/09/computing-alpha-mipmaps/), but we consider it a minor issue and just select bilinear filtering by default.
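The mipmap issue mentioned above is easy to demonstrate. Below is a rough sketch (not the method from the linked article, just a naive box filter over a made-up 1D alpha "texture") showing how plain alpha averaging shrinks the coverage of an alpha-tested cutout at each mip level:

```python
# Naive mipmapping averages alpha, so an alpha-tested (alpha > 0.5) cutout
# loses coverage at each mip level. Rough illustration with a 1D "texture".
alpha = [1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0]  # 5 of 8 texels pass the test

def next_mip(level):
    # Box filter: average each pair of texels.
    return [(level[i] + level[i + 1]) / 2 for i in range(0, len(level), 2)]

def coverage(level, threshold=0.5):
    # Fraction of texels that survive the alpha test.
    return sum(1 for a in level if a > threshold) / len(level)

mip1 = next_mip(alpha)   # [1.0, 0.0, 0.5, 1.0]
mip2 = next_mip(mip1)    # [0.5, 0.75]
print(coverage(alpha), coverage(mip1), coverage(mip2))  # 0.625 0.5 0.5
```

Coverage drops from 62.5% at the base level, which is why distant cutouts visibly thin out; the linked article's fix is to rescale the alpha in each mip so coverage stays constant.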
@stephomi Here's my article about this problem: realtimerendering.com/blog/gpus-prefer-p... I hope it explains the phenomenon, and what I think I'm seeing.
> However we indeed don't premult the png's, so the filtering texture read is incorrect!
Exactly: bilinear interpolation and mipmapping, both. That said, I wouldn't worry about it _too_ much. The artifact is a black edging around cutouts, but that's a relatively minor artifact, all things considered. For final images it's more important, and something that software renderers should just get right (since they can control the sampling and filtering process fully). From testing, some do not.
Thanks, much clearer!
I know for sure that we do premultiply the pixel before the gpu blending though.
What we basically do in the shader is (taking the simpler problem case):
color = sRGBToLinear(127, 0, 0)
color = shadingStuffs(color)
color *= 127/255 // premultiply by alpha
pixel = (color, 127) // RGBA output
// GPU blending happens in premultiplied linear
// then postprocess stuffs, and output the final linearToSRGB(pixel)
However we indeed don't premult the png's, so the filtering texture read is incorrect!
Concerning this issue, the things that I think are the most bothersome for us are:
- should I only premult the PNG that contains the transparency information, or should I also premult the other texture channels with it (for example emissive, specular, global, which might be RGB only)?
- some people sometimes pack other data (metalness, glossiness) in a texture's alpha channel, so should we check whether that alpha has actually been assigned to our transparency channel?
- should we assume the premult is done in linear or in sRGB (or depending on the texture's colorspace)?
- should we add a checkbox to let the user choose whether to premult or not? Should we do it client-side? Server-side? Both?
These are just random thoughts, but they show that even a simple issue tends to become annoyingly complicated for us :)
Or I should give the even simpler problem you see here:
255 0 0 255 and 0 0 0 0
If you interpolate between the two you get 127 0 0 127. Notice that this result must be interpreted as premultiplied: the interpolated value has to be blended with the background using the premultiplied blend equation to come out correct. The easiest fix is to treat all values coming out of pixel shaders as premultiplied; most renderers (including this one) instead output an RGBA that they treat as not premultiplied, leading to this error.
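As a sanity check on that claim, here is a small sketch (8-bit values scaled to [0, 1], blending the interpolated sample over an assumed white background) comparing the premultiplied blend equation against the straight-alpha one:

```python
# Interpolated sample between (255,0,0,255) and (0,0,0,0): (127,0,0,127).
src = (127/255, 0.0, 0.0)
a = 127/255
dst = (1.0, 1.0, 1.0)  # white background

# Correct: treat the sample as premultiplied -> out = src + (1 - a) * dst
premult = tuple(s + (1 - a) * d for s, d in zip(src, dst))

# Common mistake: treat it as straight alpha -> out = a * src + (1 - a) * dst
straight = tuple(a * s + (1 - a) * d for s, d in zip(src, dst))

print(premult)   # red channel 1.0: the full red contribution survives
print(straight)  # red channel ~0.75: the red is darkened, giving dark fringing
```

The straight-alpha path multiplies by alpha a second time (it was already folded in by the interpolation), which is exactly the darkening that shows up as fringing on cutout edges.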
I'll blog about this sometime soon and add the link here. Here's the gist:
PNGs are unassociated (not premultiplied). So imagine you have
255 0 0 255 and 0 255 0 5
A red pixel next to a mostly transparent pixel with a bit of green. Say you sample exactly between these two texels. You'd get 127 127 0 130. That's an idiotic result: just a little green almost entirely faded out now is a big contribution. You really want to have things premultiplied before the GPU does any sampling and bilinear interpolation:
255 0 0 255 and 0 5 0 5 (both are premultiplied now)
giving 127 2 0 130, which makes a lot more sense.
Most renderers (including Sketchfab's) do not properly premultiply the PNG texture by its alpha (usually done when the texture is read in), which is what makes the GPU's bilinear interpolation (and mipmapping, for that matter) work properly. The GPU samples and interpolates; you want the multiply by alpha to happen before that interpolation occurs, and the best way is to premultiply the texture itself so bilinear interpolation works correctly.
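The arithmetic above can be checked directly. A minimal sketch (integer math with truncation, matching the numbers in the example, and sampling exactly halfway between two texels):

```python
def lerp_texels(t0, t1):
    # Sample exactly halfway between two RGBA texels (integer average).
    return tuple((c0 + c1) // 2 for c0, c1 in zip(t0, t1))

def premultiply(t):
    # Fold alpha into the color channels (8-bit, truncating).
    r, g, b, a = t
    return (r * a // 255, g * a // 255, b * a // 255, a)

red   = (255, 0, 0, 255)
green = (0, 255, 0, 5)   # almost fully transparent, with a bit of green

print(lerp_texels(red, green))                            # (127, 127, 0, 130)
print(lerp_texels(premultiply(red), premultiply(green)))  # (127, 2, 0, 130)
```

Unassociated interpolation lets the nearly invisible green texel contribute half the color; premultiplying first scales its contribution by its tiny alpha, which is the sensible result.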
Hmm I'm not sure I get it.
Here the alpha channel is only 0s and 1s, no in-between values, so I don't see how premultiplying alpha in the diffuse PNG would help.
Am I missing something?
Side note: if I'm not mistaken, Minecraft uses nearest filtering, so we should probably do the same for Minecraft models :)