Bug 15657 - Incorrect EDID parsing for 1080i and other modeline related issues
Status: NEW
Product: xorg
Classification: Unclassified
Component: Server/DDX/Xorg
Version: git
Hardware: x86 (IA32) Linux (All)
Importance: medium normal
Assigned To: Xorg Project Team
QA Contact: Xorg Project Team
Depends on:
Blocks:
Reported: 2008-04-22 14:01 UTC by Josselin Mouette
Modified: 2008-07-07 00:58 UTC (History)

Attachments:
 * xorg.conf (1.70 KB, text/plain), 2008-04-22 14:01 UTC, Josselin Mouette
 * Xorg logfile (41.48 KB, text/plain), 2008-04-22 14:01 UTC, Josselin Mouette
 * Xorg logfile with EDID information (50.17 KB, text/plain), 2008-05-19 10:14 UTC, Josselin Mouette
 * Working configuration (1.96 KB, text/plain), 2008-07-06 04:41 UTC, Josselin Mouette

Description Josselin Mouette 2008-04-22 14:01:07 UTC
Created attachment 16111 [details]
xorg.conf

I'm using the digital output of a Radeon 9250 AGP, with an HDMI cable connected to the TV. However, the displayed picture is shifted about 300 pixels to the right, leaving a big black rectangle on the left.

I have tried setting different modelines to see whether it changes anything, but the driver appears to completely ignore any additional modelines.

I'm using Xorg 1.4.1~git20080131-1 (from Debian unstable) for the core server, and the radeon driver is a git snapshot dated 2008-04-17 (from Debian experimental).

Also tried the radeon driver from Debian unstable (6.8.0), same result.

Also tried with a Radeon 7500 (RV200 QW), same result as well.
Comment 1 Josselin Mouette 2008-04-22 14:01:36 UTC
Created attachment 16112 [details]
Xorg logfile
Comment 2 Josselin Mouette 2008-04-23 03:26:50 UTC
I forgot to add that I had to set the "IgnoreEDID" option, because otherwise the X server will automatically select the 1920x540 resolution, which of course the TV doesn't understand. I'll send you a log showing the returned EDID values.
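(Editor's note: for readers reproducing this, the radeon driver takes "IgnoreEDID" in the Device section of xorg.conf. A minimal sketch; the Identifier here is illustrative, the real section is in the attached xorg.conf:)

```
Section "Device"
    Identifier "Radeon"              # illustrative name; see the attached xorg.conf
    Driver     "radeon"
    Option     "IgnoreEDID" "true"   # skip the TV's EDID so the bogus 1920x540 mode is not selected
EndSection
```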
Comment 3 Josselin Mouette 2008-05-19 10:14:26 UTC
Created attachment 16627 [details]
Xorg logfile with EDID information

Here is the Xorg output without the IgnoreEDID option, using the git snapshot from 2008-05-12 in Debian experimental.

Note that with this version, the situation with IgnoreEDID is getting slightly worse, since I have no output at all.
Comment 4 Josselin Mouette 2008-07-06 04:41:25 UTC
Created attachment 17552 [details]
Working configuration

I finally managed to work around the issue. It is exactly the same as the one reported with the intel driver: http://lists.freedesktop.org/archives/xorg/2008-February/032801.html

The problems were the following:
 * The X server does not understand the EDID output returned by the TV. This is the real issue, and fixing it should make a similar setup work out of the box. It seems that the TV requests a 1080i (1920x1080 interlaced) mode, and the X server interprets it as 1920x540. 
 * The default modeline that the X server uses for 1920x1080 when no EDID information is available is not suitable at all for such a display. I had to generate another one with very small synchronisation times, and it worked like a charm. I don’t know whether other brands of TVs are as picky about these sync times, but if so, maybe the default modelines for 720p and 1080p should have different characteristics from the modelines used for "monitor" resolutions.
 * Added modelines are not used unless you specify a new mode name and set "PreferredMode". I can’t think of any good reason for this; if you specify a modeline for an existing resolution, it should definitely be used instead of the default or the detected one.
 * By default, a screen is only associated with the first output of a device, so the added modeline will only be used by that output. This behaviour is well documented, so I understand it is intentional, but it is counterintuitive.
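(Editor's note: a Monitor section along the lines of the attached working configuration might look like the sketch below. The modeline shown is the standard CEA-861 1080i 60 Hz timing, not necessarily the exact values from the attachment, and the Identifier is illustrative:)

```
Section "Monitor"
    Identifier "TV"
    # Standard CEA-861 1080i (60 Hz) timing with short sync intervals; the
    # attached configuration may use slightly different numbers tuned to this TV.
    Modeline "1920x1080i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125 interlace +hsync +vsync
    # As noted above, the new mode name must also be selected explicitly:
    Option "PreferredMode" "1920x1080i"
EndSection
```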