Summary: | [Radeonsi/Hawaii] Rendering errors when running basic opengl 3 demo | ||
---|---|---|---|
Product: | Mesa | Reporter: | Sebastian Parborg <darkdefende> |
Component: | Mesa core | Assignee: | mesa-dev |
Status: | RESOLVED NOTOURBUG | QA Contact: | mesa-dev |
Severity: | normal | ||
Priority: | medium | CC: | darkdefende |
Version: | git | ||
Hardware: | x86-64 (AMD64) | ||
OS: | Linux (All) | ||
Whiteboard: | |||
i915 platform: | i915 features: | ||
Attachments: |
The lab files.
Bunny render errors
glxinfo output
The bunny on fglrx and/or the binary nvidia drivers
s/MicroGlut/freeglut/ |
Created attachment 113409 [details]
Bunny render errors
Created attachment 113410 [details]
glxinfo output
FWIW, just tested on nvc0, llvmpipe, and softpipe -- all give the same results. At the very least this is a gallium issue, if not a Mesa one. (Classic swrast doesn't do GL 3.2, and I don't have i965 handy.)

It looks like the lab1-1.c source does not match the provided lab1-1 binary. If I compile the sources again, I get a segfault at _mesa_UniformMatrix4fv.

(In reply to Tapani Pälli from comment #4)
> It looks like lab1-1.c source does not match provided binary lab1-1. If I
> compile sources again I get segfault at _mesa_UniformMatrix4fv.

The binary should be representative of what is in the .c file; I forgot to remove the binary when I created the archive. I do not get a segfault when recompiling. It works just like before :S

(In reply to Sebastian Parborg from comment #5)
> The binary should be representative of what is in the .c file. I forgot to
> remove the binary when I created the archive.
>
> I do not get a segfault when recompiling. It works just like before :S

The way you pass the matrix does not look legitimate under the C standard. Which compiler and version are you using, and is it a C or a C++ compiler? I got it compiling and working with '--std=c99', but even then the visual result of the app differs from what the prebuilt binary produces: for me the bunny is completely solid black (on the i965 driver).

(In reply to Tapani Pälli from comment #6)
> The way how you pass matrix does not seem legit with C standard. Which
> compiler and version are you using? Are you using C or C++ compiler?

I'm using gcc 4.9, and gcc (not g++) as written in the makefile. Are you running the latest git version of the Intel drivers/Mesa?

Disregard the last question -- you have to use the supplied makefile! The bunny is black for you because you didn't use the "-DGL_GLEXT_PROTOTYPES" compile flag.
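[Editor's note: for context on why that flag matters, this is a simplified sketch of the guard pattern used in Khronos/Mesa `<GL/glext.h>`-style headers; `glUseProgram` is just one representative entry point, and the exact macro spellings here follow the public headers, not the lab code.]

```c
/* Simplified sketch of the <GL/glext.h> guard: the prototypes for
 * post-1.1 entry points are only declared when GL_GLEXT_PROTOTYPES
 * is defined. Without -DGL_GLEXT_PROTOTYPES, a direct call such as
 * glUseProgram(p) is an implicit declaration in old C dialects and
 * can silently link to garbage, which fits the "solid black bunny"
 * symptom seen when the supplied makefile was not used. */
#ifdef GL_GLEXT_PROTOTYPES
GLAPI void APIENTRY glUseProgram (GLuint program);
#endif
/* The function-pointer typedef is always declared, for loaders: */
typedef void (APIENTRYP PFNGLUSEPROGRAMPROC) (GLuint program);
```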
(In reply to Sebastian Parborg from comment #9)
> Disregard the last question. You have to use the supplied makefile!
>
> The bunny is black for you because you didn't use the
> "-DGL_GLEXT_PROTOTYPES" compile flag.

Yep, true, that was the case; the result is the same now.

Created attachment 113518 [details]
The bunny on fglrx and/or the binary nvidia drivers
This is how it looks when using the nvidia binary drivers or fglrx.
This seems to be related to the context creation done by the included "MicroGlut". The code of this is quite messy. My guess is that something is wrong with context creation, not Mesa. If I replace "MicroGlut" with freeglut, it works as expected.

Created attachment 113522 [details]
s/MicroGlut/freeglut/
(In reply to Grigori Goronzy from comment #12)
> This seems to be related to the context creation by the included
> "MicroGlut". The code of this is quite messy. My guess is that something is
> wrong with context creation, not Mesa.
>
> If I replace "MicroGlut" with freeglut, it works as expected.

Thanks for posting a workaround/fix! I'll try it out soon. (Now I don't have to reboot to fglrx to work on the labs, thanks to you :) )

My professor's "MicroGlut" might be the cause of this problem, I agree. The question is what it's doing wrong, though. It seems to work fine on OS X and with the proprietary drivers from nvidia and amd on Linux. So if it renders correctly with all of those, what is it doing wrong? Do all those drivers interpret faulty code the "right" way? To me it seems quite unlikely that all those drivers would interpret the faulty code the exact same way, so that it doesn't break on any of them.

The call to glXChooseFBConfig in MicroGlut doesn't specify any attribute list, so a list of all configs is returned by it. Later on, the GLX context creation just uses the first returned config, which can be pretty much anything. When I specify an attribute list to get a suitable FB config, it works fine. Other drivers might get away with it because they return the FB configs in a different order, but obviously you can't count on this.

(In reply to Grigori Goronzy from comment #15)
> The call to glXChooseFBConfig in MicroGlut doesn't specify any attribute
> list, so a list of all configs is returned by it. Later on, the GLX context
> creation just uses the first returned config, which can be pretty much
> anything. When I specify an attribute list to get a suitable FB config, it
> works fine.
>
> Other drivers might get away with it because they return the FB configs in a
> different order, but obviously you can't count on this.

Ah, that makes sense. It works for me now too :) Thank you (and everyone else) for your help!
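[Editor's note: a minimal sketch of the fix described above, assuming the standard GLX 1.3 API; the attribute values chosen here (RGBA, 24-bit depth, double-buffered) are plausible defaults for the demo, not quoted from MicroGlut or from the patch in attachment 113522.]

```c
#include <GL/glx.h>
#include <X11/Xlib.h>
#include <stddef.h>

/* Instead of glXChooseFBConfig(dpy, screen, NULL, &n) -- which returns
 * every config in an order the app cannot rely on -- pass an explicit
 * attribute list so that any returned config is actually usable. */
static GLXFBConfig pick_fbconfig(Display *dpy, int screen)
{
    static const int attribs[] = {
        GLX_X_RENDERABLE,  True,
        GLX_DRAWABLE_TYPE, GLX_WINDOW_BIT,
        GLX_RENDER_TYPE,   GLX_RGBA_BIT,
        GLX_RED_SIZE,      8,
        GLX_GREEN_SIZE,    8,
        GLX_BLUE_SIZE,     8,
        GLX_DEPTH_SIZE,    24,
        GLX_DOUBLEBUFFER,  True,
        None
    };
    int n = 0;
    GLXFBConfig *configs = glXChooseFBConfig(dpy, screen, attribs, &n);
    if (configs == NULL || n == 0)
        return NULL;                 /* no matching config; caller must bail */
    /* Matches are sorted by the GLX spec's priority rules, so taking the
     * first one is now meaningful rather than arbitrary. */
    GLXFBConfig chosen = configs[0];
    XFree(configs);
    return chosen;
}
```

The returned config would then be fed to glXCreateContextAttribsARB (or glXCreateNewContext) exactly where MicroGlut currently uses the first element of the unfiltered list.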
I will inform my professor and hope that he includes the changes so other students don't run into the same problems I did. I guess this bug can be closed then? As you said, the returned configs aren't guaranteed to be in any specific order, so I guess the other drivers just got lucky.
Created attachment 113408 [details]
The lab files.

I'm currently taking a course in OpenGL 3 graphics programming. When doing the first lab I noticed that the bunny (and other objects) had quite severe rendering errors when using the open source driver for my graphics card. See the attached image for how it looks. There seems to be a problem with the facing direction of the rendered triangles. However, it doesn't make that much of a difference if I turn off culling either...

The bunny renders correctly on nvidia cards (with the closed source drivers) and on my card running fglrx. The professor in charge of the course is running OS X, and the lab computers are running Linux (with nvidia cards). So I guess this is a problem only with Mesa (or something else in the driver stack). I haven't tried on Intel cards, so I can't say whether this is just a problem with radeonsi or not.

I've attached the lab code. It should run on Linux and OS X. I would really appreciate it if other people could try it out so we can see whether this is a general problem with Mesa or not. I haven't seen anything like this in other OpenGL 3 programs. So if there is a problem with the code, i.e. it breaks the OpenGL spec, then please tell me so I can inform the professor about it.