Bugzilla – Bug 37383
incorrect GLSL optimization
Last modified: 2011-07-07 14:21:02 UTC
Created attachment 46917 [details]
I have a problem with a GLSL fragment shader. As you see in the attached dump file, the shader assigns constant zero to dstColor.w after linking, but the expression that is optimized away can (and should) be nonzero.
If I rewrite the shader to assign all components in one step everything works as it should.
I have tested this on the Mesa r600g, nouveau and softpipe renderers and get the same result on each. The NVIDIA and AMD proprietary drivers compute the correct alpha value.
OpenGL vendor string: X.Org
OpenGL renderer string: Gallium 0.4 on AMD JUNIPER
OpenGL version string: 2.1 Mesa 7.11-devel
OpenGL shading language version string: 1.20
It's also worth noting that if the line
dstColor.a *= 1.0 - srcColor.a;
is changed to
dstColor.a = 1.0 - srcColor.a;
Mesa also produces correct code. That at least narrows down where things might be going wrong.
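The attached dump is not reproduced here, but from the description the failing pattern can be sketched roughly as below. This is a hypothetical reconstruction, not the reporter's actual shader; the uniform and variable names are made up for illustration. The key ingredient is a read-modify-write of a single component (dstColor.a) after other components of the same vector have been assigned per-component:

```glsl
#version 120
// Hypothetical blend-style fragment shader exhibiting the pattern
// described in this report (names are illustrative only).
uniform vec4 srcColor;
uniform vec4 baseColor;

void main()
{
    vec4 dstColor = baseColor;
    // Per-component assignments to .rgb ...
    dstColor.rgb = srcColor.rgb * srcColor.a
                 + dstColor.rgb * (1.0 - srcColor.a);
    // ... followed by a read-modify-write of .a. After linking, the
    // miscompile assigns constant zero to dstColor.w here even though
    // the expression can be nonzero.
    dstColor.a *= 1.0 - srcColor.a;
    gl_FragColor = dstColor;
}
```

Assigning all four components in a single step (e.g. one `dstColor = vec4(...)` expression) avoids the bug, as noted above.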
Also, would it be possible for you to make a piglit shader-runner test out of this (with an appropriate Signed-off-by)?
Patch posted to mesa-dev mailing list:
Fixed on master by the commit below. This commit has been cherry-picked to 7.11 (42cd619) and 7.10 (cb6dd6c).
Author: Ian Romanick <email@example.com>
Date: Mon Jun 27 16:33:13 2011 -0700
glsl: Track initial mask in constant propagation live set
The set of values initially available (before any kills) must be
tracked with each constant in the set. Otherwise the wrong component
can be selected after earlier components have been killed.
NOTE: This is a candidate for the 7.10 and 7.11 branches.
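The commit message above describes the root cause: a constant entry in the live set packs its component values according to the write mask it had when it entered the set, so a lookup must index those values by that initial mask. If later kills shrink the mask and the lookup indexes by the current (post-kill) mask instead, it selects the wrong component. A minimal sketch of that failure mode, assuming a toy model of per-component vec4 constant propagation (this is illustrative, not Mesa's actual data structures):

```python
# Toy model of per-component constant propagation for a vec4.
# Each entry: [initial_mask, live_mask, values], where `values` is packed
# in component order according to initial_mask (the mask at write time).

def lookup(entries, comp, use_initial_mask):
    """Return the propagated constant for component `comp`, or None."""
    for initial_mask, live_mask, values in entries:
        if live_mask & (1 << comp):
            # Correct behavior indexes by the mask the values were packed
            # under (initial_mask); the bug indexes by the current live_mask.
            mask = initial_mask if use_initial_mask else live_mask
            index = bin(mask & ((1 << comp) - 1)).count("1")
            return values[index]
    return None

# dstColor = vec4(0.1, 0.2, 0.3, 0.4): one entry covering components xyzw.
entries = [[0b1111, 0b1111, [0.1, 0.2, 0.3, 0.4]]]

# A later non-constant write to dstColor.y kills component 1, but the
# packed values keep their original layout.
entries[0][1] &= ~0b0010

# Reading dstColor.w (component 3):
print(lookup(entries, 3, use_initial_mask=True))   # 0.4: correct component
print(lookup(entries, 3, use_initial_mask=False))  # 0.3: wrong component
```

With the fix, each entry carries its initial mask alongside the values, so kills of earlier components no longer shift which value a lookup selects.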
Reviewed-by: Eric Anholt <firstname.lastname@example.org>
Cc: Kenneth Graunke <email@example.com>
Cc: Matthias Bentrup <firstname.lastname@example.org>