Created attachment 62198 [details]
lspci

I have an ATI Radeon HD 3850 on the AGP bus of a Pentium 4 PC. I am seeing very strange behavior from OpenGL applications: KWin, qudos, and doomsday run slowly even without high quality settings, and the Quake 4 demo from the id site crashes right after map loading (just when rendering starts, I suppose), taking the X server down with it. I am attaching my glxinfo output, kernel config, and lspci output. Software versions: mesa, libdrm, and xf86-video-ati from git, Linux kernel 3.3.5, X.org server 1.12.1. I think this is a bug in mesa or xf86-video-ati related to my video card. Please help me.
Created attachment 62199 [details] glxinfo
Created attachment 62200 [details] kernel config
dmesg output and /var/log/Xorg.0.log would add more info. If you use MIME type text/plain for them, it's easier for people to read the contents in their browser.

I've got an AGP 3850, but it's not in a PC at the moment; when I get time I'll put it back in and see if anything has regressed in the last few months. With AGP it's quite possible that what works for me with the same card will give you problems, simply because we are not using the same mobo.
Created attachment 62215 [details] Xorg log
Created attachment 62216 [details] dmesg
Created attachment 62218 [details]
quake4 console log

I attached the Xorg log, dmesg, and the log of quake4, which crashes as I said. Sorry about the MIME type; I thought Bugzilla's auto-detection would work.
(In reply to comment #6)
> Created attachment 62218 [details]
> quake4 console log
>
> I attached Xorg log, dmesg and log of quake4, which crashes like i said.
> Sorry for mimetype, i thought bugzilla auto-detect would work.

np, everyone does that. Anyway, I think I know why quake4 doesn't work. From your log:

/opt/quake4/q4base/pak
libGL error: failed to load driver: r600
libGL error: Try again with LIBGL_DEBUG=verbose for more details.
libGL error: failed to load driver: swrast
libGL error: Try again with LIBGL_DEBUG=verbose for more details.

I suspect that if you ran

LIBGL_DEBUG=verbose quake4-demo

you would see something like

.... ./libstdc++.so.6: version `GLIBCXX_3.4.9' not found ....

This is caused by the libstdc++.so.6 bundled in the quake4 demo dir being too old for mesa (well, for llvm to be precise). The solution is to rename the libstdc++.so.6 and libgcc_s.so.1 that are in /opt/quake4/, I guess, to something else, which will make q4 use your system libs.
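The rename step above can be sketched as a small shell helper. This is only a sketch: `shadow_bundled_libs` is a hypothetical name, and /opt/quake4 is assumed to be the demo's default install prefix.

```shell
#!/bin/sh
# Sketch: move the demo's bundled runtime libs out of the way so the
# dynamic loader falls back to the (newer) system copies instead.
# shadow_bundled_libs is a hypothetical helper; adjust the prefix to
# wherever the quake4 demo is actually installed.
shadow_bundled_libs() {
    dir="$1"
    for lib in libstdc++.so.6 libgcc_s.so.1; do
        if [ -e "$dir/$lib" ]; then
            mv "$dir/$lib" "$dir/$lib.bundled"
        fi
    done
}

shadow_bundled_libs /opt/quake4
```

Renaming rather than deleting means the change is trivially reversible if the system libs turn out to be incompatible with the game binary.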
(In reply to comment #7)
> Solution is to rename libstdc++.so.6 and libgcc_s.so.1 that are in
> /opt/quake4/ I guess, to something else which will make q4 use your system
> libs.

Thank you, it worked.

But I still can't understand why doomsday runs so slowly; I think it should run smoothly on my card. I also get weird screen artefacts in xonotic.
(In reply to comment #8)
> Thank you, it worked.
> But still i can't understand why doomsday runs so slow, I think it must run
> smoothly on my card. I also get weird screen artefacts in xonotic.

I haven't tested those with a 3850. IIRC nexuiz was a bit slow, but then that was at 1920x1080, and reducing the resolution/effects could make it playable. I also never had the complication of a compositing desktop - maybe there is something there to be tweaked, maybe not.

I don't know gentoo, or any distro really (I use an ancient LFS). Are you compiling mesa from git yourself, or is gentoo - and what options are being used?

I always used to test with vsync off; I can see from your xorg log that swapbuffers wait is on. I don't know how/where you would add the xorg.conf option in gentoo, but in Section "Device"

Option "SwapbuffersWait" "off"

will disable that. In addition you need to run the app with the env vblank_mode=0; a compositing desktop allowing that should disable vsync and gain a few fps.

You could avoid using the env every time by making/editing .drirc in your home dir to look like:

<driconf>
    <device screen="0" driver="dri2">
        <application name="Default">
            <option name="vblank_mode" value="0" />
        </application>
    </device>
</driconf>

Another thing you could check is whether gentoo does anything with gpu power settings:

cat /sys/class/drm/card0/device/power_method
cat /sys/class/drm/card0/device/power_profile

The only GPU-wise difference between your logs and what I remember mine to be is that your GTT is 64M. I used to use 256. I have no idea if that will help anything at all, but if you wanted to change it you could try increasing the AGP Aperture size in your BIOS settings.
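The power-settings check above can be wrapped into one small script. A sketch only: it assumes the GPU is card0 and the radeon KMS sysfs interface of that kernel generation ("profile"/"dynpm" methods, "high" profile).

```shell
#!/bin/sh
# Sketch: report the radeon power-management state; assumes card0.
card=/sys/class/drm/card0/device
if [ -f "$card/power_method" ]; then
    echo "method:  $(cat "$card/power_method")"
    echo "profile: $(cat "$card/power_profile")"
    # As root, you could pin the clocks high while benchmarking:
    #   echo profile > "$card/power_method"
    #   echo high > "$card/power_profile"
else
    echo "no radeon power sysfs node at $card"
fi
```

If the method reports "dynpm" or the profile reports "low", reclocking could easily explain poor performance in lightweight games that never push the GPU hard enough to clock up.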
Does passing radeon.agpmode=-1 on the kernel command line help?
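For reference, radeon.agpmode=-1 makes the radeon driver fall back to PCI-style transfers instead of AGP, which often works around flaky AGP chipsets. A sketch of how it would look in a GRUB legacy config; the file path, title, kernel image name, and root device are assumptions to adapt to the actual setup:

```
# /boot/grub/grub.conf (GRUB legacy; paths/devices are examples)
title Linux 3.3.5 (radeon AGP disabled)
kernel /boot/vmlinuz-3.3.5 root=/dev/sda1 radeon.agpmode=-1
```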
glxinfo in comment 1 shows: OpenGL renderer string: Gallium 0.4 on AMD RV670
-- GitLab Migration Automatic Message --

This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity. You can subscribe and participate further through the new bug through this link to our GitLab instance: https://gitlab.freedesktop.org/mesa/mesa/issues/411.