Bug 50450 - OpenGL does not work or works very slowly on Radeon HD3850
Summary: OpenGL does not work or works very slowly on Radeon HD3850
Status: RESOLVED MOVED
Alias: None
Product: Mesa
Classification: Unclassified
Component: Drivers/Gallium/r600
Version: git
Hardware: Other
OS: All
Importance: medium normal
Assignee: Default DRI bug account
 
Reported: 2012-05-29 03:38 UTC by Maxim Koltsov
Modified: 2019-09-18 19:00 UTC

Attachments
lspci (1.54 KB, text/plain), 2012-05-29 03:38 UTC, Maxim Koltsov
glxinfo (25.02 KB, text/plain), 2012-05-29 03:39 UTC, Maxim Koltsov
kernel config (71.73 KB, text/plain), 2012-05-29 03:39 UTC, Maxim Koltsov
Xorg log (35.69 KB, text/plain), 2012-05-29 05:27 UTC, Maxim Koltsov
dmesg (49.88 KB, text/plain), 2012-05-29 05:27 UTC, Maxim Koltsov
quake4 console log (17.59 KB, text/plain), 2012-05-29 05:28 UTC, Maxim Koltsov

Description Maxim Koltsov 2012-05-29 03:38:06 UTC
Created attachment 62198 [details]
lspci

I have an ATI Radeon HD3850 on the AGP bus in a Pentium 4 PC. I observe very strange behavior from OpenGL applications:
KWin, qudos and doomsday run slowly even without high quality settings;
the Quake4 demo from the id Software site crashes right after map loading, just when rendering starts I suppose, and the X server crashes with it.
I have attached my glxinfo, kernel config and lspci. The software versions used are: Mesa, libdrm and xf86-video-ati from git, Linux kernel 3.3.5, X.org server 1.12.1.
I think this is a bug in Mesa or xf86-video-ati related to my video card. Please help me.
Comment 1 Maxim Koltsov 2012-05-29 03:39:36 UTC
Created attachment 62199 [details]
glxinfo
Comment 2 Maxim Koltsov 2012-05-29 03:39:55 UTC
Created attachment 62200 [details]
kernel config
Comment 3 Andy Furniss 2012-05-29 04:57:03 UTC
dmesg output and /var/log/Xorg.0.log would add more info.

If you use MIME type text/plain for them, it's easier for people to read the contents in their browser.

I've got an AGP 3850, but it's not in a PC at the moment; when I get time I'll put it back in and see if anything has regressed in the last few months.

With AGP it's quite possible that what works for me with the same card will give you problems just because we are not using the same mobo.
Comment 4 Maxim Koltsov 2012-05-29 05:27:08 UTC
Created attachment 62215 [details]
Xorg log
Comment 5 Maxim Koltsov 2012-05-29 05:27:36 UTC
Created attachment 62216 [details]
dmesg
Comment 6 Maxim Koltsov 2012-05-29 05:28:41 UTC
Created attachment 62218 [details]
quake4 console log

I attached the Xorg log, dmesg and the log of quake4, which crashes as I said.
Sorry for the MIME type; I thought Bugzilla's auto-detection would work.
Comment 7 Andy Furniss 2012-05-29 06:26:59 UTC
(In reply to comment #6)
> Created attachment 62218 [details]
> quake4 console log
> 
> I attached the Xorg log, dmesg and the log of quake4, which crashes as I said.
> Sorry for the MIME type; I thought Bugzilla's auto-detection would work.

np everyone does that.

Anyway I think I know why quake4 doesn't work: from your log -

/opt/quake4/q4base/paklibGL error: failed to load driver: r600
libGL error: Try again with LIBGL_DEBUG=verbose for more details.
libGL error: failed to load driver: swrast
libGL error: Try again with LIBGL_DEBUG=verbose for more details.

I suspect if you did run -

LIBGL_DEBUG=verbose quake4-demo 

that you would see something like -

.... ./libstdc++.so.6: version `GLIBCXX_3.4.9' not found ....

This is caused by there being a libstdc++.so.6 in the quake4 demo dir that is too old for Mesa (well, LLVM to be precise).

Solution is to rename libstdc++.so.6 and libgcc_s.so.1 that are in  
/opt/quake4/ I guess, to something else which will make q4 use your system libs.
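For example, something along these lines (the exact paths depend on where the demo put its bundled libs, so treat this as a sketch):

cd /opt/quake4
mv libstdc++.so.6 libstdc++.so.6.bundled
mv libgcc_s.so.1 libgcc_s.so.1.bundled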
Comment 8 Maxim Koltsov 2012-05-29 06:51:37 UTC
(In reply to comment #7)
> Solution is to rename libstdc++.so.6 and libgcc_s.so.1 that are in  
> /opt/quake4/ I guess, to something else which will make q4 use your system
> libs.

Thank you, it worked.
But I still can't understand why doomsday runs so slowly; I think it should run smoothly on my card. I also get weird screen artefacts in xonotic.
Comment 9 Andy Furniss 2012-05-29 08:07:33 UTC
(In reply to comment #8)
> (In reply to comment #7)
> > Solution is to rename libstdc++.so.6 and libgcc_s.so.1 that are in  
> > /opt/quake4/ I guess, to something else which will make q4 use your system
> > libs.
> 
> Thank you, it worked.
> But I still can't understand why doomsday runs so slowly; I think it should run
> smoothly on my card. I also get weird screen artefacts in xonotic.

I haven't tested those with a 3850; IIRC nexuiz was a bit slow, but then that was at 1920x1080, and reducing resolution/effects could make it playable.

I also never had the complication of a compositing desktop - maybe there is something there to be tweaked - maybe not.

I don't know Gentoo, or any distro really (I'm using ancient LFS). Are you compiling Mesa git yourself, or is Gentoo doing it, and what options are being used?

I always used to test with vsync off; I can see from your Xorg log that swapbuffers wait is on. I don't know how or where you would add the xorg.conf option in Gentoo, but putting

        Option          "SwapbuffersWait" "off"

inside your Section "Device" will disable that.

In addition, you need to run the app with the environment variable

vblank_mode=0

Compositing desktop allowing, that should disable vsync and gain a few fps.
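For example, to launch one of the affected games with vsync disabled (the binary name here is just an illustration, use whatever command starts the game):

vblank_mode=0 doomsday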

You could avoid using the env every time by making/editing .drirc in your home dir to look like -

<driconf>
    <device screen="0" driver="dri2">
        <application name="Default">
            <option name="vblank_mode" value="0" />
        </application>
    </device>
</driconf>

Another thing you could check is whether Gentoo does anything with GPU power settings.

cat /sys/class/drm/card0/device/power_method
cat /sys/class/drm/card0/device/power_profile
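If power_method reports "profile", you could also try forcing the high profile as root; this is only a sketch of the old radeon sysfs interface from that kernel era, so check that the files exist first:

echo profile > /sys/class/drm/card0/device/power_method
echo high > /sys/class/drm/card0/device/power_profile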

The only GPU-wise difference between your logs and what I remember mine being is that your GTT is 64M. I used to use 256M. I have no idea if that will help anything at all, but you could try increasing the AGP aperture size in your BIOS settings if you wanted to change it.
Comment 10 Michel Dänzer 2012-06-01 02:09:16 UTC
Does passing radeon.agpmode=-1 on the kernel command line help?
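(For reference, with GRUB legacy one way to test this is to append the parameter to the kernel line in /boot/grub/menu.lst; the kernel image and root device below are placeholders for whatever your existing entry uses:

kernel /boot/vmlinuz-3.3.5 root=/dev/sdXN radeon.agpmode=-1

then reboot.)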
Comment 11 Andreas Boll 2012-08-09 14:53:12 UTC
glxinfo in comment 1 shows:
OpenGL renderer string: Gallium 0.4 on AMD RV670
Comment 12 GitLab Migration User 2019-09-18 19:00:18 UTC
-- GitLab Migration Automatic Message --

This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity.

You can subscribe and participate further through the new bug through this link to our GitLab instance: https://gitlab.freedesktop.org/mesa/mesa/issues/411.

