Bug 2881 - Readiness of DRM for 2 radeons
Status: RESOLVED FIXED
Product: DRI
Classification: Unclassified
Component: DRM/other
Version: XOrg git
Hardware: x86 (IA32) Linux (All)
Priority: high  Severity: normal
Assigned To: Default DRI bug account
Reported: 2005-04-01 14:05 UTC by Anatoly M. Verkhovsky
Modified: 2006-04-29 00:57 UTC (History)



Attachments
Xorg.1.log (40.98 KB, text/plain)
2005-04-01 14:12 UTC, Anatoly M. Verkhovsky

Description Anatoly M. Verkhovsky 2005-04-01 14:05:18 UTC
I have two Radeon (r200) cards, one AGP and one PCI.  DRI works on AGP but not
on PCI, although Xorg.1.log as well as glxgears report DRI being enabled and
fine.  Glxgears runs at ~300 fps (on the AGP head it runs at 1500 fps, and that
is a slower/older card).

Here is relevant part of dmesg:

ACPI: PCI Interrupt Link [APC3] enabled at IRQ 18
ACPI: PCI interrupt 0000:01:08.0[A] -> GSI 18 (level, high) -> IRQ 18
[drm] Initialized radeon 1.15.0 20050208 on minor 0: ATI Technologies Inc RV280
[Radeon 9200 SE]
ACPI: PCI Interrupt Link [APC4] enabled at IRQ 19
ACPI: PCI interrupt 0000:02:00.0[A] -> GSI 19 (level, high) -> IRQ 19
mtrr: 0xd8000000,0x8000000 overlaps existing 0xd8000000,0x4000000
[drm] Initialized radeon 1.15.0 20050208 on minor 1: ATI Technologies Inc Radeon
R200 QM [Radeon 9100]
[drm:drm_ati_pcigart_cleanup] *ERROR* no scatter/gather memory!
[drm:radeon_do_cleanup_cp] *ERROR* failed to cleanup PCI GART!

On the same note, there is a problem with i2c monitor detection.  When the i2c
bus of the second card is scanned, the configuration of the first card is found.

PS. The X server contains minor changes, in keyboard handling and VT switching,
to allow two X servers to run simultaneously; otherwise it is an out-of-the-box
xorg-x11-6.8.2 (as delivered by Gentoo).
Comment 1 Anatoly M. Verkhovsky 2005-04-01 14:12:49 UTC
Created attachment 2288 [details]
Xorg.1.log

This log contains no indication of the problem with DRI; it only shows a
failure to detect the secondary monitor (that would be the second monitor on
the second device).
Comment 2 Alex Deucher 2005-04-02 10:49:16 UTC
Do you have the BusIDs specified properly for each Device section in your
config?  Also, FWIW, I don't think anyone has gotten the DRI working with two
or more cards.
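For reference, a dual-card setup normally pins each Device section to a slot
with a BusID line.  A minimal sketch; the BusID values below are derived from
the PCI addresses in the dmesg output above (0000:01:08.0 and 0000:02:00.0),
and the Identifier names are illustrative:

```
Section "Device"
    Identifier "Radeon-AGP"         # RV280 [Radeon 9200 SE]
    Driver     "radeon"
    BusID      "PCI:1:8:0"          # from dmesg 0000:01:08.0
EndSection

Section "Device"
    Identifier "Radeon-PCI"         # R200 QM [Radeon 9100]
    Driver     "radeon"
    BusID      "PCI:2:0:0"          # from dmesg 0000:02:00.0
EndSection
```

Without explicit BusIDs the server may probe and bind both Screen sections to
the same card.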
Comment 3 Michel Dänzer 2005-04-02 15:09:31 UTC
(In reply to comment #2)
> FWIW, I don't think anyone has gotten the DRI working with two or more cards.

AFAIR the traditional problems would have prevented the DRI from getting enabled
at all on the second card, so at least most of them seem to have been resolved.

(In reply to comment #0)
> I have two Radeon (r200) cards, one AGP and one PCI.  DRI works on AGP but not
> on PCI, although Xorg.1.log as well as glxgears report DRI being enabled and
> fine.  Glxgears runs at ~300 fps (on the AGP head it runs at 1500 fps, and
> that is a slower/older card).

Performance is known to be quite mysteriously low on some (most?) PCI setups.
You can check whether 3D acceleration is actually being used by comparing
framerate as well as CPU usage (in particular whether the CPU is used by the X
server or by the 3D application itself) between runs with and without
LIBGL_ALWAYS_INDIRECT=1, and/or by setting LIBGL_DEBUG=verbose.
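The comparison described above can be run roughly like this (glxgears and
glxinfo are assumed to be installed; the frame rates shown will of course
depend on the hardware):

```shell
# Baseline: direct rendering via the DRI (the default).  Note the frame
# rate and, in top, which process is consuming CPU.
glxgears

# Force indirect rendering through the X server.  If the frame rate and
# CPU pattern barely change, the direct path was not really in use.
LIBGL_ALWAYS_INDIRECT=1 glxgears

# Ask libGL to report which driver it loads and whether it falls back
# to indirect rendering.
LIBGL_DEBUG=verbose glxinfo | grep -i "direct rendering"
```

With real direct rendering, the first run should show most CPU time in the
application; with LIBGL_ALWAYS_INDIRECT=1, the load shifts to the X server.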
Comment 4 Anatoly M. Verkhovsky 2005-04-03 13:36:52 UTC
Michel, it seems you are correct.

I ran glxgears in my normal setup and got 308 fps with negligible CPU usage
(~1%); then I tried with LIBGL_ALWAYS_INDIRECT=1 and got 277 fps with 100% CPU
usage.

This can only have one meaning: DRI is enabled and working, but quite slow.
This also explains why I'm experiencing bug 2852 on both heads every few hours.
Could that GART scatter/gather issue be relevant?
Comment 5 Dave Airlie 2006-04-29 17:57:30 UTC
The original bug is fixed; we should be able to run two radeon cards.  The PCI
slowness is another bug...