Bug 23308 - yuv to rgb color space conversion shader renders in "pink fashion" with Intel 965GM
Summary: yuv to rgb color space conversion shader renders in "pink fashion" with Intel 965GM
Status: RESOLVED FIXED
Alias: None
Product: Mesa
Classification: Unclassified
Component: Mesa core
Version: unspecified
Hardware: x86 (IA32) Linux (All)
Importance: medium major
Assignee: Ian Romanick
QA Contact:
URL:
Whiteboard:
Keywords:
Depends on:
Blocks:
 
Reported: 2009-08-14 07:44 UTC by Gilbert Brisset
Modified: 2009-10-19 10:56 UTC
CC: 2 users

See Also:
i915 platform:
i915 features:


Attachments
YUV to RGB GLSL shader demo with Glut, gtkglarea, gtkglext and GLX (186.26 KB, application/x-gzip)
2009-08-17 04:27 UTC, Gilbert Brisset

Description Gilbert Brisset 2009-08-14 07:44:10 UTC
I want to render a video stream through OpenGL.
YUV to RGB color space conversion is done with a GLSL shader.
Then I plan to program an edge enhancement shader.
This application is planned to run on an embedded PC board using an Intel 965GM chipset. The application will be developed in GTK and in Java.
OpenGL is fully supported on this chipset with the latest Mesa 7.5 driver.
Unfortunately we have color rendering problems. The image appears pink.
I attach a demo that builds a red, green, blue and black YUV 4:2:0 pattern and then renders it in an OpenGL context with a color space conversion shader.
The demo has compile-time defines that allow selecting the rendering framework among GLUT, gtkglext, gtkglarea and native GLX.
Both GLUT and GLX work well on Intel 965GM. gtkglext and gtkglarea render in "pink fashion" on Intel 965GM.
All four frameworks render with correct colors on an Nvidia Quadro FX 770M chipset, but with a closed-source driver.
It seems that something has been modified in the GLX API with the latest Mesa driver and this modification has not been propagated to gtkglarea and gtkglext.

Some preliminary tests, still to be refined, show that the same bug appears in the LWJGL framework, which allows using OpenGL from Java.

I'm not sure that this is the appropriate place to report this bug; maybe it would be better reported in each framework's Bugzilla.
But maybe you have an idea.

The attached demo is delivered in a tar file. Just untar it, edit the main.c file, uncomment the define for the chosen framework, and run make. Then type ./yuv_rgb_glx_demo.
This demo has been tested with
- a Lenovo R61 laptop with Fedora 11 32 bits,
- a Dell Precision M4400 laptop with Fedora 10 32 bits.

glxinfo files are attached in the tar file.

Thank you.
Comment 1 Gordon Jin 2009-08-17 02:04:50 UTC
attachment?
Comment 2 Gilbert Brisset 2009-08-17 04:27:29 UTC
Created attachment 28702 [details]
YUV to RGB GLSL shader demo with Glut, gtkglarea, gtkglext and GLX

Sorry, I thought I had attached it.
Hope it's ok now.
Comment 3 Gilbert Brisset 2009-08-19 01:34:44 UTC
I have run some tests with constant YUV or RGB values in the shader.
The following shader, in the context of the attached demo, displays a solid red.

uniform sampler2DRect Ytex;
uniform sampler2DRect Utex;
uniform sampler2DRect Vtex;
uniform float         XFormat ; // 2 if 4:2:x format, 1 if 4:4:4 
uniform float         YFormat ; // 2 if 4:2:0 format, 1 if 4:4:4 or 4:2:2

void main(void) 
{
gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0) ; 
}

Now, in the same context, the following shader is supposed to compute a pure green RGB vector from the given YUV values.
uniform sampler2DRect Ytex;
uniform sampler2DRect Utex;
uniform sampler2DRect Vtex;
uniform float         XFormat ; // 2 if 4:2:x format, 1 if 4:4:4 
uniform float         YFormat ; // 2 if 4:2:0 format, 1 if 4:4:4 or 4:2:2

void main(void) 
{
float r, g, b, y, u, v ;

y = 144.0/256.0 ;
u =  54.0/256.0 ;
v =  34.0/256.0 ;

y  = 1.1643 * (y - 0.0625) ;
u  = u - 0.5 ;
v  = v - 0.5 ;

r  = y + 1.5958  * v ;
g  = y - 0.39173 * u - 0.81290 * v ;
b  = y + 2.017   * u ;

gl_FragColor = vec4(r,g,b,1.0) ; 
}

In the context of the attached demo, this shader gives a green color with the GLX or GLUT frameworks, but a pink color with the gtkglarea or gtkglext frameworks.
This means that something in the OpenGL pipeline, explicit in the demo code or implicit as a default behaviour, works differently between the frameworks and modifies the shader computations.

I continue to investigate.
Comment 4 Gilbert Brisset 2009-08-19 04:34:38 UTC
(This comment is a verbatim duplicate of comment 3.)
Comment 5 Gilbert Brisset 2009-08-19 07:02:42 UTC
I have found a strange bug in the GLSL shaders with gtkglarea and gtkglext on a 965GM graphics chipset.
The following shader:

void main(void) 
{
float r, g, b ;
r = 0.5 ; g = 0.5 ; b = 0.5 ;
gl_FragColor = vec4(r, g, b, 1.0) ; 
}

displays grey with GLUT and GLX but black with gtkglarea and gtkglext.

The following shader displays grey with all frameworks:

void main(void) 
{
float r, g, b ;
r = 1.0 / 2.0 ; g =  1.0 / 2.0 ; b = 1.0 / 2.0  ;
gl_FragColor = vec4(r, g, b, 1.0) ; 
}

Fun, isn't it !!

So this leads to a workaround for the problem that is the subject of this bug report.
In the shader given in the attached demo, replace all the color space conversion matrix coefficient constants by fractional constants. For example, replace:
y  = 1.1643 * (y - 0.0625) ;
by
y  = 2.3286 / 2.0 * (y - 0.125 / 2.0 ) ;

And the demo displays a red, green, blue and black pattern, not pink, on all platforms.
BUT the final computed RGB values depend on the framework used.

It is expected that the shader, fed with the pattern, computes the following R, G, B values:
      R   G   B
blue  0   0   255
green 0   255 0
red   255 0   0
black 0   0   0
We get the following values with GLUT and GLX
      R   G   B
blue  17  63  255
green 32  245 0
red   209 0   0
black 1   0   1
We get the following values with gtkglarea and gtkglext
      R   G   B
blue  26  101 255
green 34  181 0
red   210 0   6
black 17  16  17

Conclusion:
- there is obviously a bug in the way gtkglarea and gtkglext drive the shader compilation;
- there is extra processing in the OpenGL pipeline, different between the two kinds of frameworks, that modifies the colors.
Does anybody have an idea?
Comment 6 Gilbert Brisset 2009-08-21 01:35:46 UTC
void main(void) 
{
float r, g, b ;
r = 5.0e-1 ; g =  5.0e-1 ; b = 5.0e-1  ;
gl_FragColor = vec4(r, g, b, 1.0) ; 
}

Intel 965GM + Mesa 7.5 + GLUT | GLX : displays grey
Intel 965GM + Mesa 7.5 + gtkglarea | gtkglext : displays white

void main(void) 
{
float r, g, b ;
r = 0.05e1 ; g = 0.05e1 ; b = 0.05e1  ;
gl_FragColor = vec4(r, g, b, 1.0) ; 
}

Intel 965GM + Mesa 7.5 + GLUT | GLX : displays grey
Intel 965GM + Mesa 7.5 + gtkglarea | gtkglext : displays black
Comment 7 Gilbert Brisset 2009-08-21 09:58:39 UTC
Some more results from my investigations.
In the YUV to RGB conversion shader, I had suggested replacing the formulas as follows:

replace :
y  = 1.1643 * (y - 0.0625) ;
by
y  = 2.3286 / 2.0 * (y - 0.125 / 2.0 ) ;

This works but gives only approximately correct colors.
It is better to do the following:

replace :
y  = 1.1643 * (y - 0.0625) ;
by
y  = 1164.3 / 1000.0 * (y - 62.5 / 1000.0 ) ;

and the colors are perfect.

This means that it is a GLSL compilation bug.
Since the final pixel value at the end of the pipeline is probably an 8-bit unsigned integer ranging from 0 to 255, it looks as if the compiler first casts the floating-point constants to integers, then computes:
1.1643 * y          gives 1 * y
2.3286 / 2.0 * y    gives 2/2 * y (no better)
1164.3 / 1000.0 * y gives 1164 / 1000 * y (pretty good)

But I thought that the GLSL compiler was in the Mesa driver, not in the gtkglext extension.
Any idea?

Comment 8 Eric Anholt 2009-10-07 14:48:41 UTC
I extracted the tarball, applied:

--- a/main.c
+++ b/main.c
@@ -61,8 +61,8 @@
  *  - the LWJGL framework allowing to use OpenGL with Java.
  *
  */
-#define USES_GLUT_FRAMEWORK
-//#define USES_GTKGLEXT_FRAMEWORK
+//#define USES_GLUT_FRAMEWORK
+#define USES_GTKGLEXT_FRAMEWORK
 //#define USES_GTKGLAREA_FRAMEWORK
 //#define USES_GLX_FRAMEWORK

typed make, ran ./yuv_rgb_glx_demo, and I see squares of blue, green, red, black on Mesa 7.4, 7.5, 7.6, and master.  Similarly, the "supposed to compute a pure green" shader produces a pure green.

Am I building as you expect for reproducing the bug?
Comment 9 Gilbert Brisset 2009-10-08 01:34:12 UTC
Hello

The bug appears under two conditions:

- use a gtkglarea or gtkglext wrapper,
- use an Intel 965GM graphics chipset.

I understand that you meet the first condition.
(#define USES_GTKGLEXT_FRAMEWORK is the only line uncommented)

What is your hardware?
Comment 10 Peter Christoffersen 2009-10-08 11:07:33 UTC
Hi,

I have a similar problem with some of my code and found this bug.

I have discovered that it may be related to the locale setting. The gtkglext version is broken (pink) with my default language setting 'en_DK.UTF8', but fine with 'C'.

Fine:
LANG=C ./yuv_rgb_glx_demo

Pink:
LANG=en_DK.UTF8 ./yuv_rgb_glx_demo

I have been running it on an Intel 965G. Without looking at the Mesa code, I believe it may be a bug in how Mesa parses floats in the shader source. When using a Danish locale, a lot of the standard C functions switch from using '.' as the decimal separator in floats to using ',' instead.

Also, ARB shaders seem to suffer from the same bug. I have some code using ARB shaders running on an Intel 945GME which shows similar behavior.
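
For reference, a minimal standalone C sketch (not part of the attached demo; the "da_DK.UTF-8" locale name is just an example) illustrating the effect described above: under a locale whose decimal separator is ',', strtod() stops parsing at the '.', truncating the constant.

#include <locale.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const char *src = "1.1643";  /* a coefficient as it appears in the GLSL source */

    /* "C" locale: '.' is the decimal separator, the full value is parsed */
    setlocale(LC_ALL, "C");
    printf("C locale     : %f\n", strtod(src, NULL));      /* prints 1.164300 */

    /* a locale using ',' as decimal separator (must be installed on the system) */
    if (setlocale(LC_ALL, "da_DK.UTF-8") != NULL)
        printf("Danish locale: %f\n", strtod(src, NULL));   /* prints 1.000000 */

    return 0;
}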
Comment 11 Ian Romanick 2009-10-08 12:53:35 UTC
(In reply to comment #10)

> I have discovered that it may be related to locale setting. The gtkglext
> version is broken (pink) with my default language settings 'en_DK.UTF8', but
> fine with 'C'.
> 
> Fine:
> LANG=C ./yuv_rgb_glx_demo
> 
> Pink:
> LANG=en_DK.UTF8 ./yuv_rgb_glx_demo
> 
> I have been running it on Intel 965G. Without looking at the mesa code I
> believe it may be a bug in how mesa parses floats in the shader source. When
> using Danish locale a lot of the standard c functions switch from using '.' in
> floats and uses ',' instead.

Of course!  I'll write a quick piglit test, and create _mesa_strtof that ignores the locale setting.

Thanks for tracking down the root cause.
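
A rough sketch of what such a locale-independent parser could look like (the helper name is hypothetical and this assumes glibc's newlocale()/strtod_l(); the actual Mesa change may differ):

/* Parse a double from shader/program source using an explicit "C" locale,
 * so that '.' is always the decimal separator regardless of the
 * application's LC_NUMERIC setting.  Requires _GNU_SOURCE on glibc.
 * (One-time initialization shown unsynchronized for brevity.) */
#define _GNU_SOURCE
#include <locale.h>
#include <stdlib.h>

double
locale_independent_strtod(const char *s, char **end)
{
    static locale_t c_locale = (locale_t) 0;

    if (c_locale == (locale_t) 0)
        c_locale = newlocale(LC_CTYPE_MASK | LC_NUMERIC_MASK, "C", (locale_t) 0);

    if (c_locale != (locale_t) 0)
        return strtod_l(s, end, c_locale);

    return strtod(s, end);  /* fall back to the current locale */
}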
Comment 12 Gilbert Brisset 2009-10-09 09:42:45 UTC
Great job. Thank you a lot.
The default behaviour of an ANSI C program is that the locale is forced to "C".
But the gtk_init() primitive forces the locale to "" (I guess), which then uses the system environment variable, fr_FR.utf8 in my case.
The glutInit() primitive does not do that.

A workaround, which should preferably become a coding rule, is to surround the GLSL compilation part of the code with:

/* get current locale (copy it: the string returned by setlocale() may be
   overwritten by a later setlocale() call) */
char * CurrentLocale = strdup(setlocale(LC_ALL, NULL)) ;

/* force "C" locale */
setlocale(LC_ALL, "C") ;

/* compile shader */
...

/* set back the locale */
setlocale(LC_ALL, CurrentLocale) ;
free(CurrentLocale) ;

I will report this information to the gtkglext, LWJGL and OpenGL mailing lists.

Regards.
Comment 13 Peter Christoffersen 2009-10-10 09:29:56 UTC
> A workaround, which should preferably become a coding rule, is to surround the GLSL
> compilation part of the code with:
> 
> /* get current locale (copy it: the string returned by setlocale() may be
>    overwritten by a later setlocale() call) */
> char * CurrentLocale = strdup(setlocale(LC_ALL, NULL)) ;
> 
> /* force "C" locale */
> setlocale(LC_ALL, "C") ;
> 
> /* compile shader */
> ...
> 
> /* set back the locale */
> setlocale(LC_ALL, CurrentLocale) ;
> free(CurrentLocale) ;
> 


It's an OK workaround if you know what you are doing, but it can cause problems in multi-threaded applications because setlocale() is process-wide.
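
For comparison, a sketch of a thread-safer variant (assuming a POSIX 2008 platform providing newlocale()/uselocale(); the function name is hypothetical): only the calling thread is switched to the "C" locale around shader compilation.

#include <locale.h>

void compile_shaders_with_c_locale(void)
{
    /* create a "C" locale object and install it for this thread only */
    locale_t c_loc = newlocale(LC_ALL_MASK, "C", (locale_t) 0);
    locale_t old   = (locale_t) 0;

    if (c_loc != (locale_t) 0)
        old = uselocale(c_loc);

    /* ... glCompileShader() / glLinkProgram() calls go here ... */

    if (c_loc != (locale_t) 0) {
        uselocale(old);      /* restore the thread's previous locale */
        freelocale(c_loc);
    }
}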
Comment 14 Eric Anholt 2009-10-19 10:56:31 UTC
commit 89b31c9619449d5c9b8ebe4e245c2a926e3583e6
Author: Brian Paul <brianp@vmware.com>
Date:   Wed Oct 14 14:19:03 2009 -0600

    mesa: use C locale for _mesa_strtod()
    
    _mesa_strtod() is used for shader/program parsing where the decimal
    point character is always '.'  Use strtod_l() with a "C" locale to
    ensure correct string->double conversion when the actual locale uses
    another character such as ',' for the decimal point.
    
    Fixes bug 24531.

