Created attachment 16111 [details]
I'm using the digital output of a Radeon 9250 AGP with an HDMI cable connected to the TV. However, the displayed picture is shifted about 300 pixels to the right, leaving a large black rectangle on the left.
I have tried setting different modelines to see whether it changes anything, but the driver appears to completely ignore any additional modelines.
I'm using Xorg 1.4.1~git20080131-1 (from Debian unstable) for the core server, and the radeon driver is a git snapshot dated 2008-04-17 (from Debian experimental).
Also tried the radeon driver from Debian unstable (6.8.0), same result.
Also tried with a Radeon 7500 (RV200 QW), same result as well.
Created attachment 16112 [details]
I forgot to add that I had to set the "IgnoreEDID" option, because otherwise the X server automatically selects the 1920x540 resolution, which of course the TV doesn't understand. I'll attach a log showing the returned EDID values.
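For reference, this is roughly the xorg.conf fragment I mean (a sketch; the "Radeon" identifier is illustrative, and "IgnoreEDID" is a radeon driver option set in the Device section):

```
Section "Device"
    Identifier "Radeon"
    Driver     "radeon"
    # Stop the driver from acting on the TV's (misparsed) EDID block
    Option     "IgnoreEDID" "true"
EndSection
```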
Created attachment 16627 [details]
Xorg logfile with EDID information
Here is the Xorg output without the IgnoreEDID option, using the git snapshot from 2008-05-12 in Debian experimental.
Note that with this version the situation with IgnoreEDID is slightly worse: I get no output at all.
Created attachment 17552 [details]
I finally managed to work around the issue. It is exactly the same one reported against the intel driver: http://lists.freedesktop.org/archives/xorg/2008-February/032801.html
The problems were the following:
* The X server does not understand the EDID data returned by the TV. This is the real issue; fixing it should make a similar setup work out of the box. It seems that the TV requests a 1080i (1920x1080 interlaced) mode, and the X server interprets it as 1920x540.
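The 1920x540 misreading is consistent with how EDID encodes interlaced modes: the detailed timing descriptor stores the line count per *field*, and an interlace flag tells the parser to double it for the full frame. A minimal sketch of that rule (the function and its arguments are illustrative, not a real EDID parser):

```python
def decode_vertical(v_active_field_lines, interlaced):
    """EDID detailed timings store the per-field line count; the
    interlace flag in the descriptor means the full frame has twice
    as many lines."""
    return v_active_field_lines * 2 if interlaced else v_active_field_lines

# The TV's 1080i mode: 540 active lines per field, interlace bit set.
frame_height = decode_vertical(540, interlaced=True)
print(frame_height)    # full frame height when the flag is honoured

# A parser that ignores the interlace bit sees only the field height,
# which would explain the bogus 1920x540 mode:
misread_height = decode_vertical(540, interlaced=False)
print(misread_height)
```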
* The default modeline that the X server uses for 1920x1080 when no EDID information is available is not suitable at all for such a display. I had to generate another one with very small synchronisation times, and it worked like a charm. I don’t know whether other brands of TVs are as picky about these sync times, but if so, maybe the default modelines for 720p and 1080p should have different characteristics from the modelines used for "monitor" resolutions.
* Added modelines are not used unless you specify a new mode name and set "PreferredMode". I can't think of any good reason for this: if you specify a modeline for an existing resolution, it should definitely be used instead of the default or the detected one.
* By default, a monitor section is only associated with the first output of a device, so the added modeline is only applied to that output. It is actually well documented, so I understand this is intentional, but it is counterintuitive.
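Putting the last three points together, the working configuration looks roughly like this (a hedged sketch: the identifiers, the "DVI-0" output name, and the exact timings are my assumptions; the modeline shown is the standard CEA-861 1080p@60 timing, which has the short sync/blanking intervals described above):

```
Section "Monitor"
    Identifier "TV"
    # CEA-861 1080p@60: 148.5 MHz pixel clock, short sync intervals
    Modeline   "1080p" 148.50 1920 2008 2052 2200 1080 1084 1089 1125 +hsync +vsync
    # A new mode name plus PreferredMode is required, otherwise the
    # added modeline is silently ignored
    Option     "PreferredMode" "1080p"
EndSection

Section "Device"
    Identifier "Radeon"
    Driver     "radeon"
    # Bind the Monitor section to the actual output; without this,
    # only the device's first output would pick up the modeline
    Option     "Monitor-DVI-0" "TV"
EndSection
```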