Bug 61635 - glVertexAttribPointer(id, GL_UNSIGNED_BYTE, GL_FALSE,...) does not work
Summary: glVertexAttribPointer(id, GL_UNSIGNED_BYTE, GL_FALSE,...) does not work
Status: RESOLVED FIXED
Alias: None
Product: Mesa
Classification: Unclassified
Component: Drivers/DRI/nouveau
Version: git
Hardware: x86 (IA32) All
Importance: medium normal
Assignee: Nouveau Project
QA Contact:
URL:
Whiteboard:
Keywords:
Depends on:
Blocks:
 
Reported: 2013-02-28 21:02 UTC by rodrigo
Modified: 2013-08-16 00:33 UTC
CC List: 2 users

See Also:
i915 platform:
i915 features:


Attachments
Typo? (720 bytes, patch)
2013-03-01 10:45 UTC, Emil Velikov
Details | Splinter Review
Test case to show the problem with unnormalized attribute pointers of type GL_UNSIGNED_BYTE (2.05 KB, text/plain)
2013-03-04 19:54 UTC, rodrigo
Details

Description rodrigo 2013-02-28 21:02:30 UTC
I have an NVIDIA GeForce 7600GT with the latest git nouveau driver (Gallium 0.4 on NV4B) and kernel 3.7.9.

I found the following problem: calling glVertexAttribPointer(idx, GL_UNSIGNED_BYTE, GL_FALSE, ...) does not work; that is, the previous attrib pointer stays in effect, and the nouveau driver prints the following in dmesg:

[ 2052.630628] nouveau  [  PGRAPH][0000:01:00.0]  ERROR nsource: DATA_ERROR nstatus: BAD_ARGUMENT
[ 2052.630644] nouveau E[  PGRAPH][0000:01:00.0] ch 3 [0x000dd000] subc 7 class 0x4097 mthd 0x1744 data 0x00001417

But glGetError() does not report any error.

If I change the parameter from GL_UNSIGNED_BYTE to any other type (GL_BYTE, GL_UNSIGNED_SHORT, GL_SHORT...) then it works as expected.

If I change the "normalized" parameter to GL_TRUE, it also works fine, even with type GL_UNSIGNED_BYTE.

The software renderer does not have this problem. That and the kernel messages suggest that it is a nouveau issue.
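
For reference, the failing combination boils down to something like this (a simplified excerpt, not my actual code; the attribute index, buffer names and counts are made up):

    #include <string.h>
    #include <GL/glew.h>

    GLubyte values[NUM_VERTS];                  /* one unsigned byte per vertex */
    memset(values, 128, sizeof(values));

    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(values), values, GL_STATIC_DRAW);

    /* Broken on nouveau: unnormalized GL_UNSIGNED_BYTE attribute */
    glVertexAttribPointer(attrib_idx, 1, GL_UNSIGNED_BYTE, GL_FALSE, 0, (void *)0);
    glEnableVertexAttribArray(attrib_idx);

    /* Both of these work as expected:                                         */
    /* glVertexAttribPointer(attrib_idx, 1, GL_SHORT,         GL_FALSE, 0, 0); */
    /* glVertexAttribPointer(attrib_idx, 1, GL_UNSIGNED_BYTE, GL_TRUE,  0, 0); */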
Comment 1 Emil Velikov 2013-03-01 10:45:05 UTC
Created attachment 75733 [details] [review]
Typo?

Can you try the attached patch? It assumes there is a typo in the defs.

With that said, the nv30-40 vertex format table needs some love/expansion
Comment 2 rodrigo 2013-03-01 20:27:08 UTC
Thanks, Emil, for your quick reply!

I've just tried this patch... The good news is that it no longer corrupts the render, and there is no dmesg output either. The bad news is that it still does not work correctly.

This is what happens with the patch applied:

I'm sending a vertex attribute array with the same byte value for all the vertices, and this value is used to calculate the color of the vertex, so I expect it to render a solid color. The actual output is that most vertices are rendered black (input 0), with just a few of them in random colors that change with each frame.

Weird...

If you think it could be useful I can try to extract the code into a small compilable test case...
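
In the meantime, the relevant part of my setup looks roughly like this (simplified and with identifiers renamed; the shader is only paraphrased in the comment):

    /* Vertex shader side (paraphrased):
     *     attribute float material;
     *     ...the color is computed from "material", so a constant input
     *     should give a solid color...
     */

    GLubyte material[NUM_VERTS];
    memset(material, 3, sizeof(material));        /* same byte value for every vertex */

    glVertexAttribPointer(loc_material, 1, GL_UNSIGNED_BYTE, GL_FALSE,
                          0, material);           /* unnormalized client-side array */
    glEnableVertexAttribArray(loc_material);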
Comment 3 Emil Velikov 2013-03-02 20:15:51 UTC
(In reply to comment #2)
> Thanks, Emil, for your quick reply!
> 
> I've just tried this patch... The good news is that it no longer corrupts
> the render, and there is no dmesg output either. The bad news is that it
> still does not work correctly.
> 
> This is what happens with the patch applied:
> 
> I'm sending a vertex attribute array with the same byte value for all the
> vertices, and this value is used to calculate the color of the vertex, so
> I expect it to render a solid color. The actual output is that most
> vertices are rendered black (input 0), with just a few of them in random
> colors that change with each frame.
> 

I was afraid of that. It looks like the original define was a bit off, although obviously this is not the correct one either.

Someone with an actual nv4x card and a better understanding of OpenGL than me should take a look and reverse-engineer the (remaining) format(s).


> Weird...
> 
> If you think it could be useful I can try to extract the code into a small
> compilable test case...

That's always a good idea :)
Comment 4 rodrigo 2013-03-04 19:54:59 UTC
Created attachment 75914 [details]
Test case to show the problem with unnormalized attribute pointers of type GL_UNSIGNED_BYTE

Compile with:

$ gcc -o glbyte -Wall glbyte.c -lglut -lGLEW -lGL

Running with software rendering:

$ LIBGL_ALWAYS_SOFTWARE=1 ./glbyte

draws a beautiful star.

But with hardware rendering:

$ ./glbyte

it draws nothing :(
Comment 5 Ilia Mirkin 2013-08-13 17:49:28 UTC
FTR, removing

   _(R8_USCALED          , U8_USCALED , 1),
   _(R8G8_USCALED        , U8_USCALED , 2),
   _(R8G8B8_USCALED      , U8_USCALED , 3),
   _(R8G8B8A8_USCALED    , U8_USCALED , 4),

from nv30_format.c appears to fix the sample program that you had (it falls back to R32G32_FLOAT). Obviously that's probably not the right approach overall... perhaps those should all be UNORM/SNORM instead of UNORM/USCALED... or SNORM/SSCALED...
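
For anyone following along, my rough understanding of how the GL parameters map onto those formats (approximate, not lifted from the spec or the code):

    /* GL_UNSIGNED_BYTE, normalized = GL_FALSE -> R8*_USCALED (255 stays 255.0) */
    /* GL_UNSIGNED_BYTE, normalized = GL_TRUE  -> R8*_UNORM   (255 becomes 1.0) */
    /* GL_BYTE,          normalized = GL_FALSE -> R8*_SSCALED                   */
    /* GL_BYTE,          normalized = GL_TRUE  -> R8*_SNORM                     */

So dropping the USCALED rows just stops the driver from advertising native support for the unnormalized case, and the data presumably gets converted to floats on upload instead; hence the R32G32_FLOAT fallback.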
Comment 6 Ilia Mirkin 2013-08-14 05:21:18 UTC
If you're still able to test on that NV4B, could you redo your program to take a vec4 instead of a vec2 (and just feed it 0's), and see if it works then? (Without any Mesa changes.) It worked for me on an NV44.
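
I.e. something along these lines (illustrative only, not your actual test case):

    /* shader: declare the attribute as vec4 instead of vec2 */

    GLubyte data[NUM_VERTS * 4];                  /* x, y, 0, 0 per vertex */
    /* ...fill x/y as before, leave the last two components at 0... */

    glVertexAttribPointer(loc, 4, GL_UNSIGNED_BYTE, GL_FALSE, 0, data);
    glEnableVertexAttribArray(loc);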
Comment 7 Ilia Mirkin 2013-08-16 00:33:08 UTC
Fix now in mesa-git (http://cgit.freedesktop.org/mesa/mesa/commit/?id=14ee790df77c810f187860a8d51096173ff39fcf), pretty sure it's the right thing. Closing this. Thanks for the test case, very useful!

