I have two radeon (r200) cards, AGP and PCI. DRI works on AGP but not on PCI,
although Xorg.1.log as well as glxgears report DRI being enabled and fine.
Glxgears runs at ~300 fps (on the AGP head it runs at 1500 fps, and it's a
slower/older card).
Here is relevant part of dmesg:
ACPI: PCI Interrupt Link [APC3] enabled at IRQ 18
ACPI: PCI interrupt 0000:01:08.0[A] -> GSI 18 (level, high) -> IRQ 18
[drm] Initialized radeon 1.15.0 20050208 on minor 0: ATI Technologies Inc RV280
[Radeon 9200 SE]
ACPI: PCI Interrupt Link [APC4] enabled at IRQ 19
ACPI: PCI interrupt 0000:02:00.0[A] -> GSI 19 (level, high) -> IRQ 19
mtrr: 0xd8000000,0x8000000 overlaps existing 0xd8000000,0x4000000
[drm] Initialized radeon 1.15.0 20050208 on minor 1: ATI Technologies Inc Radeon
R200 QM [Radeon 9100]
[drm:drm_ati_pcigart_cleanup] *ERROR* no scatter/gather memory!
[drm:radeon_do_cleanup_cp] *ERROR* failed to cleanup PCI GART!
On the same note, there is a problem with i2c monitor detection: when the i2c
bus of the second card is scanned, the configuration of the first card is found.
PS. The X server contains minor changes, in keyboard handling and VT switching,
to allow two X servers to run simultaneously; otherwise it's an out-of-the-box
xorg-x11-6.8.2 (as delivered by Gentoo).
Created attachment 2288
This log contains no indication of the problem with DRI; it only shows the
failure to detect the secondary monitor (that would be the second monitor on
the second card).
Do you have the BusIDs specified properly for each Device section in your
config? Also, FWIW, I don't think anyone has gotten the DRI working with two or
more cards.
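For reference, a minimal sketch of what the two Device sections with explicit BusIDs could look like, using the PCI addresses 0000:01:08.0 and 0000:02:00.0 from the dmesg above (the Identifier names are placeholders):

```
Section "Device"
    Identifier "Radeon-AGP"        # RV280 [Radeon 9200 SE] at 0000:01:08.0
    Driver     "radeon"
    BusID      "PCI:1:8:0"
EndSection

Section "Device"
    Identifier "Radeon-PCI"        # R200 QM [Radeon 9100] at 0000:02:00.0
    Driver     "radeon"
    BusID      "PCI:2:0:0"
EndSection
```

Note that BusID uses decimal bus:device:function, so 0000:01:08.0 becomes "PCI:1:8:0".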
(In reply to comment #2)
> FWIW, I don't think anyone has gotten the DRI working with two or more cards.
AFAIR the traditional problems would have prevented the DRI from getting enabled
at all on the second card, so at least most of them seem to have been resolved.
(In reply to comment #0)
> I have two radeon (r200) cards, AGP and PCI. DRI works on AGP but not on PCI,
> although Xorg.1.log as well as glxgears report DRI being enabled and fine.
> Glxgears runs at ~300 fps (on AGP head it runs at 1500 fps and it's a
> slower/older card).
Performance is known to be quite mysteriously low on some (most?) PCI setups.
You can check whether 3D acceleration is actually being used by comparing the
behaviour (framerate as well as CPU usage, in particular whether the CPU is
used by the X server or by the 3D application itself) when running with and
without LIBGL_ALWAYS_INDIRECT=1, and/or by setting LIBGL_DEBUG=verbose.
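The comparison described above could be run like this (a sketch; glxgears and glxinfo are assumed to be installed, and the inline variable assignment applies only to that one command):

```shell
# Baseline: direct rendering via the DRI. Note the fps and, with top(1),
# whether the CPU time is spent in glxgears or in the X server.
glxgears

# Force indirect rendering through the X server for comparison.
LIBGL_ALWAYS_INDIRECT=1 glxgears

# Have libGL report which driver it loads and whether rendering is direct.
LIBGL_DEBUG=verbose glxinfo | grep -i "direct rendering"
```

If the framerates are close but the CPU usage pattern differs (idle with DRI, X server pegged without), DRI is enabled but running slowly, as observed below.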
Michel, it seems you are correct.
I ran glxgears in my normal setup and got 308 fps with negligible CPU usage
(~1%); then I tried with LIBGL_ALWAYS_INDIRECT=1 and got 277 fps with 100% CPU
usage. This can only mean one thing: DRI is enabled and working, but quite slowly.
This also explains why I'm experiencing bug 2852 on both heads every few hours.
Could that GART scatter/gather issue be relevant?
The original bug is fixed; we should be able to run two radeon cards. The PCI
slowness is another bug...