| Summary: | glVertexAttribPointer(id, GL_UNSIGNED_BYTE, GL_FALSE,...) does not work | | |
|---|---|---|---|
| Product: | Mesa | Reporter: | rodrigo <rodrigorivascosta> |
| Component: | Drivers/DRI/nouveau | Assignee: | Nouveau Project <nouveau> |
| Status: | RESOLVED FIXED | QA Contact: | |
| Severity: | normal | | |
| Priority: | medium | CC: | malaperle, masao-takahashi |
| Version: | git | | |
| Hardware: | x86 (IA32) | | |
| OS: | All | | |
| Whiteboard: | | | |
| i915 platform: | | i915 features: | |
Attachments:
- Typo?
- Test case to show the problem with unnormalized attribute pointers of type GL_UNSIGNED_BYTE
Description (rodrigo, 2013-02-28 21:02:30 UTC)

Comment 1 (Emil)

Created attachment 75733 [details] [review] Typo?

Can you try the attached patch? It assumes there is a typo in the defs. With that said, the nv30-40 vertex format table needs some love/expansion.

Comment 2 (rodrigo)

Thanks, Emil, for your quick reply!

I've just tried this patch... The good news is that it no longer corrupts the render, and there is no output to dmesg either. The bad news is that it still does not work correctly.

This is what happens with the patch applied:

I'm sending a vertex attribute array with the same byte value for all the vertices, and this value is used to calculate the color of the vertex, so I expect it to render a solid color. The actual output is that most vertices are rendered black (input 0), with just a few of them in random colors, changing with each frame.

Weird...

If you think it could be useful, I can try to extract the code into a small compilable test case...

Comment 3 (Emil)

(In reply to comment #2)
> Thanks, Emil, for your quick reply!
>
> I've just tried this patch... The good news is that it no longer corrupts
> the render, and there is no output to dmesg either. The bad news is that it
> still does not work correctly.
>
> This is what happens with the patch applied:
>
> I'm sending a vertex attribute array with the same byte value for all the
> vertices, and this value is used to calculate the colour of the vertex, so I
> expect it to render a solid colour. The actual output is that most vertices
> are rendered black (input 0), with just a few of them in random colours,
> changing with each frame.

I was afraid of that. It looks like the original define was a bit off, although obviously this is not the correct one either. Someone with an actual nv4x card and a better understanding of OpenGL than me should take a look and reverse-engineer the (remaining) format(s).

> Weird...
>
> If you think it could be useful, I can try to extract the code into a small
> compilable test case...

That's always a good idea :)

Comment 4 (rodrigo)

Created attachment 75914 [details]
Test case to show the problem with unnormalized attribute pointers of type GL_UNSIGNED_BYTE
Compile with:
$ gcc -o glbyte -Wall glbyte.c -lglut -lGLEW -lGL
Running with software rendering:
$ LIBGL_ALWAYS_SOFTWARE=1 ./glbyte
draws a beautiful star.
But with hardware rendering:
$ ./glbyte
it draws nothing :(
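For context, the difference between a normalized and an unnormalized GL_UNSIGNED_BYTE attribute can be sketched in plain C (the helper names are illustrative, not Mesa internals):

```c
#include <assert.h>
#include <math.h>

/* How a GL_UNSIGNED_BYTE vertex attribute should reach the shader,
 * depending on the `normalized` argument of glVertexAttribPointer. */

/* normalized = GL_TRUE (UNORM): the byte is scaled into [0.0, 1.0]. */
static float ubyte_unorm(unsigned char v)
{
    return (float)v / 255.0f;
}

/* normalized = GL_FALSE ("scaled", as in the driver's USCALED formats):
 * the byte is converted directly to float, keeping its integer value. */
static float ubyte_uscaled(unsigned char v)
{
    return (float)v;
}
```

In the scenario described above, an unnormalized byte attribute with the same value for every vertex should reach the shader as that exact value (e.g. 200.0) for every vertex; mostly-black output suggests the driver is fetching the data with the wrong vertex format.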
Comment 5

FTR, removing

    _(R8_USCALED      , U8_USCALED , 1),
    _(R8G8_USCALED    , U8_USCALED , 2),
    _(R8G8B8_USCALED  , U8_USCALED , 3),
    _(R8G8B8A8_USCALED, U8_USCALED , 4),

from nv30_format.c appears to fix the sample program that you had (it falls back to R32G32_FLOAT). Obviously that's probably not the right approach overall... perhaps those should all be UNORM/SNORM instead of UNORM/USCALED... or SNORM/SSCALED...

If you're still able to test on that NV4B, could you redo your program to take a vec4 instead of a vec2 (and just feed it 0's), and see if it works then? (Without any mesa changes.) It worked for me on a NV44.

Comment 6

Fix now in mesa-git (http://cgit.freedesktop.org/mesa/mesa/commit/?id=14ee790df77c810f187860a8d51096173ff39fcf), pretty sure it's the right thing. Closing this. Thanks for the test case, very useful!
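The vec4-instead-of-vec2 workaround suggested above amounts to padding each vertex's attribute data to four components and zero-filling the extras, so the attribute can be uploaded with size 4. A minimal sketch of that repacking (`pad_vec2_to_vec4` is a hypothetical helper, not part of the test case):

```c
#include <assert.h>
#include <stddef.h>

/* Expand a tightly packed 2-component GL_UNSIGNED_BYTE attribute array
 * to 4 components per vertex, feeding 0 for the two extra components,
 * so glVertexAttribPointer can be called with size 4 instead of 2. */
static void pad_vec2_to_vec4(const unsigned char *src, unsigned char *dst,
                             size_t nverts)
{
    for (size_t i = 0; i < nverts; i++) {
        dst[4 * i + 0] = src[2 * i + 0];
        dst[4 * i + 1] = src[2 * i + 1];
        dst[4 * i + 2] = 0;   /* padding, fed as 0 per the suggestion */
        dst[4 * i + 3] = 0;
    }
}
```

The shader's vec2 input can stay as-is if the extra components are simply ignored; the point of the experiment was to see whether the 4-component byte format is fetched correctly by the hardware while the 2-component one is not.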