I have an NVIDIA GeForce 7600GT with the latest git nouveau driver (Gallium 0.4 on NV4B) and kernel 3.7.9.
I found the following problem: calling glVertexAttribPointer(idx, GL_UNSIGNED_BYTE, GL_FALSE,...) does not work; that is, the previous attrib pointer remains in effect, and the nouveau driver logs the following in dmesg:
[ 2052.630628] nouveau [ PGRAPH][0000:01:00.0] ERROR nsource: DATA_ERROR nstatus: BAD_ARGUMENT
[ 2052.630644] nouveau E[ PGRAPH][0000:01:00.0] ch 3 [0x000dd000] subc 7 class 0x4097 mthd 0x1744 data 0x00001417
glGetError(), however, reports no error.
If I change the type parameter from GL_UNSIGNED_BYTE to any other type (GL_BYTE, GL_UNSIGNED_SHORT, GL_SHORT, ...), then it works as expected.
If I change the "normalized" parameter to GL_TRUE, it also works fine, even with type GL_UNSIGNED_BYTE.
The software renderer does not have this problem. That and the kernel messages suggest that it is a nouveau issue.
Created attachment 75733 [details] [review]
Can you try the attached patch? It assumes there is a typo in the defines.
With that said, the nv30-40 vertex format table needs some love/expansion
Thanks, Emil, for your quick reply!
I've just tried this patch... The good news is that it no longer corrupts the rendering, and there is no output in dmesg either. The bad news is that it still does not work correctly.
This is what happens with the patch applied:
I'm sending a vertex attribute array with the same byte value for all the vertices, and this value is used to calculate the color of the vertex, so I expect it to render a solid color. The actual output is that most vertices are rendered black (input 0) with just a few of them of random colors, changing with each frame.
If you think it could be useful I can try to extract the code into a small compilable test case...
(In reply to comment #2)
> Thank, Emil, for your quick reply!
> I've just tried this patch... The good news it that it no longer corrupts
> the render, and no output to the dmesg, either. The bad news is that it
> still does not work correctly.
> This is what happens with the patched applied:
> I'm sending a vertex attribute array with the same byte value for all the
> vertices, and this value is used to calculate the colour of the vertex, so I
> expect it to render a solid colour. The actual output is that most vertices
> are rendered black (input 0) with just a few of them of random colours,
> changing with each frame.
I was afraid of that. It looks like the original define was a bit off, although obviously this is not the correct one either.
Someone with an actual nv4x card and better understanding of OpenGL than me should take a look and reverse-engineer the (remaining) format(s)
> If you think it could be useful I can try to extract the code into a small
> compilable test case...
That's always a good idea :)
Created attachment 75914 [details]
Test case to show the problem with unnormalized attribute pointers of type GL_UNSIGNED_BYTE
$ gcc -o glbyte -Wall glbyte.c -lglut -lGLEW -lGL
Running with software rendering:
$ LIBGL_ALWAYS_SOFTWARE=1 ./glbyte
draws a beautiful star.
But with hardware rendering, it draws nothing :(
Removing

_(R8_USCALED       , U8_USCALED , 1),
_(R8G8_USCALED     , U8_USCALED , 2),
_(R8G8B8_USCALED   , U8_USCALED , 3),
_(R8G8B8A8_USCALED , U8_USCALED , 4),

from nv30_format.c appears to fix the sample program that you had (it falls back to R32G32_FLOAT). Obviously that's probably not the right approach overall... perhaps those should all be UNORM/SNORM instead of UNORM/USCALED... or SNORM/SSCALED...
If you're still able to test on that NV4B, could you redo your program to take a vec4 instead of a vec2 (and just feed it 0's), and see if it works then? (Without any mesa changes.) It worked for me on a NV44.
Fix now in mesa-git (http://cgit.freedesktop.org/mesa/mesa/commit/?id=14ee790df77c810f187860a8d51096173ff39fcf), pretty sure it's the right thing. Closing this. Thanks for the test case, very useful!