I've just bought an Asus-branded Radeon 9200SE 128MiB RGB+DVI card (manufacturer's code A9200SE/TD/P/128M/A, part number 90-C1VBG6-GUAY1Z, serial number 47CM1I3257), which I'm using with a Dell 2001FP LCD monitor. I'm having some trouble getting it to work with the Xorg X server (using the "radeon" driver) on my Duron-based PC running Linux 2.6.8.1 with the radeon DRM module (the one that ships with Linux 2.6.8.1) built in. I've tried both the Xorg 6.7.0 release and the CVS tag XORG-6_7_99_902; both behave the same way. The logs below are from the CVS version.

The first problem is that DVI output doesn't work at all in X: I can view the text-mode console fine through the monitor's DVI input, but once I start X the screen goes blank and the monitor switches off (indicating no input). RGB output, however, works beautifully in both text mode and X; the picture quality is very nice. I've tried starting X with the RGB cable disconnected, rebooting the machine with the RGB cable disconnected, and various values for the "MonitorLayout" option. Nothing produces a picture on the DVI output.

The second problem is that XVideo overlays don't appear to be visible (I'm using MPlayer and VLC to test this). The player appears to be playing correctly, but instead of the video I just get a plain dark blue window. Since I can only use the RGB output at the moment, I guessed that the driver might be sending the scaler output to the wrong CRTC, so I tried the "OverlayOnCRTC2" option, but it made no difference. (Using OpenGL output in MPlayer works, but it's too slow for comfortable viewing; in general, OpenGL works, but it seems much slower than my old Matrox G400 was. I haven't experimented with this very much, but I thought it worth mentioning.)
I'll attach the output of "lspci -v", my xorg.conf, and /var/log/Xorg.0.log from two different test runs: the first being the first time I start Xorg after a cold boot with just the DVI cable attached, and the second being when I start it again after connecting the RGB cable too. If there's anything you'd like me to try, or any more information you'd like me to provide, please get in touch. Thanks!
Created attachment 671 [details] Output of "lspci -v" as root
Created attachment 672 [details] Minimal xorg.conf
Created attachment 673 [details] Xorg.0.log after cold boot with RGB disconnected
Created attachment 674 [details] Xorg.0.log after attaching RGB cable
Created attachment 690 [details] Tool to set XV_SWITCHCRT to 1 After some experimentation, I was able to get XVideo output working on the RGB output by writing this little program, which sets the XV_SWITCHCRT attribute to 1. XV_SWITCHCRT was left at 0 even with OverlayOnCRTC2 enabled in xorg.conf. Perhaps it should default to 1 if that option is specified?
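For anyone who'd rather not download the attachment: the core of such a tool is just a call to XvSetPortAttribute. Here's a minimal sketch of what it might look like (this is my illustration, not necessarily identical to the attached program; it assumes the overlay lives on the first adaptor's base port, and would be built with something like "cc -o setswitchcrt setswitchcrt.c -lXv -lX11"):

```c
/* Sketch: set the XV_SWITCHCRT port attribute to 1 on the first
 * XVideo adaptor's base port. Assumes that port is the overlay port,
 * which holds for the radeon driver's usual adaptor layout. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xvlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    unsigned int ver, rev, reqbase, evbase, errbase, nadaptors;
    XvAdaptorInfo *adaptors;
    Atom attr;

    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }
    if (XvQueryExtension(dpy, &ver, &rev, &reqbase, &evbase, &errbase)
            != Success) {
        fprintf(stderr, "XVideo extension not available\n");
        return 1;
    }
    if (XvQueryAdaptors(dpy, DefaultRootWindow(dpy), &nadaptors, &adaptors)
            != Success || nadaptors == 0) {
        fprintf(stderr, "no XVideo adaptors found\n");
        return 1;
    }
    /* Only exists if the driver exposes this attribute. */
    attr = XInternAtom(dpy, "XV_SWITCHCRT", True);
    if (attr == None) {
        fprintf(stderr, "driver does not expose XV_SWITCHCRT\n");
        return 1;
    }
    XvSetPortAttribute(dpy, adaptors[0].base_id, attr, 1);
    XvFreeAdaptorInfo(adaptors);
    XCloseDisplay(dpy);
    return 0;
}
```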
Try the "MonitorLayout" option to force the output types; see the radeon man page. The OverlayOnCRTC2 option has been removed and replaced with the Xv attribute. The driver should detect which output is active and associate it with CRTC1 if you are using one monitor. The Xv attribute is really only for switching the overlay when clone mode is in effect.
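For reference, the option goes in the Device section of xorg.conf; a minimal sketch (the Identifier string here is just an example):

```
Section "Device"
    Identifier "Radeon 9200SE"
    Driver     "radeon"
    # First value is head 1, second is head 2; accepted values per
    # the radeon man page are NONE, CRT, TMDS, and LVDS.
    Option     "MonitorLayout" "TMDS, NONE"
EndSection
```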
I've tried the following with only the DVI output connected:

Option "MonitorLayout" "TMDS, NONE"
Option "MonitorLayout" "TMDS, CRT"
Option "MonitorLayout" "TMDS, TMDS"
  The monitor stays turned on, but the screen is completely black.

Option "MonitorLayout" "NONE, TMDS"
  The monitor turns off.

Option "MonitorLayout" "LVDS, NONE"
  X spins on startup, necessitating a reboot to make the machine usable again.

I guessed that OverlayOnCRTC2 had been retired, but it's still listed in the radeon manpage, and the driver is clearly detecting the wrong output on my system.
Just to let you know that at least one other user is having this problem (no DVI output; I can't verify the invisible-XVideo part), with several generic 128M Radeon 9200 (non-SE) cards, since Xorg 6.8 at least. My initial workaround was to use the UseFBDev option, but that disables some of the recent DRI enhancements, so I've reverted to using the analog D-SUB output exclusively. I tried comparing the kernel's initialization with the X server's, but saw no significant differences. One thing I did notice, though, is that if you disable the X analog D-SUB output, either by detaching the connector before X startup or by using MonitorLayout, you still get some sort of funky video display on the CRT output (perhaps what's supposed to be displayed on the DVI output). Of course, MonitorLayout seems to have its own quirks, notably that DDC information seems to get garbled when it's used. Note that I use the same monitor (an HP L2335) for both outputs.
Aha! Having discovered by accident that the DVI output did work for lower resolutions, I found this page: http://suif.stanford.edu/~csapuntz/rv280-linux-dvi.html ... which suggests that it's a problem with either the chipset or the monitor not being able to handle the default 1600x1200 modeline, and gives an alternative to use (which works fine for me in 6.8.2 without the given patch). This solves all the problems I was having, so I'd be happy to see this resolved as FIXED/NOTABUG. I'm not sure if it'd be possible to detect an LCD being used with an rv280 and use the alternate modelines automatically -- it sounds like that'd be hard to make bulletproof -- but it might be worth documenting this in the radeon man page. Thanks to all the people who've commented on this bug; hopefully the same fix will work for others.
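In case that page disappears: the fix amounts to adding a reduced-blanking 1600x1200 modeline to the Monitor section of xorg.conf. The timing below is the standard CVT reduced-blanking 1600x1200@60 mode (as produced by the cvt utility), which may not be byte-for-byte identical to the one on that page, so treat it as an approximation:

```
Section "Monitor"
    Identifier "DVI LCD"
    # CVT reduced-blanking 1600x1200@60; the lower 130.25 MHz pixel
    # clock avoids the dot-clock limit that the default 1600x1200
    # timing appears to exceed on the RV280's TMDS output.
    Modeline "1600x1200" 130.25 1600 1648 1680 1760 1200 1203 1207 1235 +hsync -vsync
EndSection
```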
*** Bug 945 has been marked as a duplicate of this bug. ***
*** Bug 1724 has been marked as a duplicate of this bug. ***
*** Bug 4193 has been marked as a duplicate of this bug. ***
*** Bug 3627 has been marked as a duplicate of this bug. ***
Please open a separate bug report to track the xv problem, if it persists.
*** This bug has been marked as a duplicate of 5386 ***