The xf86-video-intel driver does not fully handle mode selection through an SDVO ADD2->DVI adaptor; specifically, it does not cope with interlaced displays. This prevents full use of any HDTV with a native resolution of 1080i (1920x1080 50/60 Hz interlaced) on an Intel GMA based *nix PC, over DVI or HDMI.
Work on this is in progress on the mailing list.
Reassign Hong's bugs back to the default for triage.
Ling will work on interlaced mode around Q1 2009.
Wow. I would love to see this fixed. This bug is not widely publicized, and I only found it here after a lot of digging and many hours spent trying to figure out why interlaced modes such as 1080i weren't working.
I'm willing to provide assistance here as far as testing and debug. I don't have any knowledge of the intel video driver directly, though.
*** Bug 21391 has been marked as a duplicate of this bug. ***
*** Bug 21901 has been marked as a duplicate of this bug. ***
*** Bug 21905 has been marked as a duplicate of this bug. ***
(In reply to comment #5)
> I'm willing to provide assistance here as far as testing and debug. I don't
> have any knowledge of the intel video driver directly, though.

Sorry for the late response. Interlace support has now been added in Eric's latest drm-intel-next tree. Would you please help test whether it works for you? The drm-intel-next tree can be obtained with the following commands:

1. git clone git://git.kernel.org/pub/scm/linux/kernel/git/anholt/drm-intel.git
2. git branch -r
3. git checkout -b drm-intel-next origin/drm-intel-next

Thanks.
Yakui
I'm no longer in a position to test this (the hardware is no longer available to me). Hopefully you can find someone else to test it.
I can test this myself, as I still have the issue, but I will need some support (I did install the 2.6.34 drm-intel kernel yesterday, but I had too many issues with the samba 'panic action' script, /usr/share/samba/panic-action, ssh access, etc.). If you want, I can install this kernel: http://kernel.ubuntu.com/~kernel-ppa/mainline/drm-intel-next/2010-06-02-lucid/ and test with my xorg.conf:

Section "ServerLayout"
    Identifier "Default Layout"
    Screen "TV"
EndSection

Section "Device"
    Identifier "Intel HD Graphics"
    Driver "intel"
    BusID "PCI:0:2:0"
    Option "monitor-VGA1" "none"
    Option "monitor-DP1" "none"
    Option "monitor-HDMI1" "none"
    Option "monitor-HDMI2" "TSB-TV"
    Option "monitor-HDMI3" "none"
    Option "monitor-DP2" "none"
    Option "monitor-DP3" "none"
    #Option "UseEdidFreqs" "FALSE"
    #Option "ConnectedMonitor" "TSB-TV"
    #Option "CustomEDID" "TSB-TV:/etc/X11/TSB0103.bin"
    Option "ModeDebug" "true"
EndSection

Section "Monitor"
    Identifier "TSB-TV"
    VendorName "Toshiba"
    ModelName "42WLG66"
    #Option "TV_FORMAT" "NTSC-M"
    #Option "TV_Connector" "Component"
    # This optional entry specifies whether the monitor should be turned on at startup.
    # By default, the server will attempt to enable all connected monitors.
    Option "enable" "true"
    HorizSync 15.0 - 46.0
    VertRefresh 49.0 - 122.0
    DisplaySize 1920 1080
    Modeline "1920x1080i" 74.25 1920 2448 2492 2640 1080 1084 1094 1125 interlace +hsync +vsync
    Modeline "1920x1080" 74.25 1920 2448 2492 2640 1080 1084 1094 1125 +hsync +vsync
    Modeline "720x480_59.9" 27.00 720 736 798 858 480 489 495 525 -hsync -vsync
    Modeline "1920x1080_28.00" 73.79 1920 1968 2160 2400 1080 1081 1084 1098 -HSync +Vsync
    Modeline "1280x720_50.0" 74.25 1280 1720 1760 1980 720 725 730 750 +hsync +vsync
    Option "PreferredMode" "1920x1080"
    # This optional entry specifies that the monitor should be ignored entirely, and not
    # reported through RandR. This is useful if the hardware reports the presence of
    # outputs that do not exist.
    #Option "Ignore" "true"
EndSection

Section "Screen"
    Identifier "TV"
    Device "Intel HD Graphics"
    Monitor "TSB-TV"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Virtual 1920 1080
        Modes "1920x1080i" "1920x1080" "1920x1080_28.00" "1280x720_50.0" "720x480_59.9"
    EndSubSection
EndSection

Section "Monitor"
    Identifier "none"
    Option "Ignore" "true"
EndSection

FYI, I also opened another thread: http://lists.freedesktop.org/archives/intel-gfx/2010-June/007018.html
I think this is supported now.
At least partially:

commit 734b4157b367d66405f7dab80085d17c9c8dd3b5
Author: Krzysztof Halasa <khc@pm.waw.pl>
Date:   Tue May 25 18:41:46 2010 +0200

    drm/i915: Add support for interlaced display.

    This doesn't change the clock limits (minimums), i.e. it won't make it
    output 720x576 PAL nor 720x480 NTSC, but it will work with modes like
    1080i etc. (including GLX and textured Xvideo, not sure about the
    overlay).

    Tested on i915 + analog VGA, it would be worth checking if newer chips
    (and which ones) still support interlaced mode.

    Signed-off-by: Krzysztof Halasa <khc@pm.waw.pl>
    Signed-off-by: Eric Anholt <eric@anholt.net>
Hello,

What is the status on this? I just installed the 2.11 intel drivers with a 2.6.34 kernel and it doesn't work. Last time I tried with 2.6.35-rc2 it was still not working. Is this supposed to work on Intel HD Graphics as well?

If someone has managed to make this work, could you share some details on how, and maybe your xorg.conf?

Many thanks,
XabiX

PS: do we know if this is going to be ported to Lucid and/or Maverick?
I've been running into this bug on a machine I just put together with a Core i3 Clarkdale chip (using the integrated graphics).

I'm connecting the computer to an HDTV that supports 720p and 1080i. The connection is done over HDMI. When I turn on the computer, it boots up and the kernel driver selects 720p for console mode before X is launched, and when X starts up, it auto-selects 720p as well. Running xrandr reinforces this, as it only lists 720p and 540p modes as available.

I looked through the commit to drm-intel mentioned above to see what it actually changed. It made a change to intel_display.c and a small but significant change to intel_crt.c, where it changed interlace_allowed (in the intel_crt_init function) from 0 to 1. I looked through some other places in the source and found the same line (in an equivalent function) in intel_hdmi.c, where interlace_allowed was equal to 0. I changed it to 1, recompiled, and rebooted with the new kernel. The i915 kernel driver selected 1080i for the console, which appeared vertically stretched, with only the top half of the console showing up on-screen (stretched vertically by a factor of two). X chose 1080i as well, also only displaying the top half, but vertically stretched to fill the screen. Switching back to 720p (using xrandr) worked fine.

If interlacing does work with the i915 driver on CRT monitors (and the patch to enable it was only 15 lines or so), enabling interlacing over HDMI shouldn't be too difficult; I just don't quite have the programming know-how or familiarity with the i915 source to do it myself. I can provide any testing/diagnostics needed to someone who can pull it off, though.
(In reply to comment #15)
> I've been running into this bug on a machine I just put together with a
> Core i3 Clarkdale chip (using the integrated graphics).
> [...]

Have you tried with the latest 2.13 drivers from http://intellinuxgraphics.org/2010Q3.html ?

According to the release note at http://lists.freedesktop.org/archive...er/008172.html

"Krzysztof Halasa (1):
      Allow interlaced modes."
It looks like this should be enabled by default in the source. Unfortunately I am not that skilled, so I am waiting for the new drivers to become available on Ubuntu Launchpad to test.

What is strange is that there is not much communication from the intel driver dev team about whether this has really been tested. Then again, there may not be many of us hitting this issue :-( (because we have old TVs :) )
http://lists.freedesktop.org/archives/intel-gfx/2010-September/008192.html
(In reply to comment #16)
> Have you tried with the latest 2.13 drivers from
> http://intellinuxgraphics.org/2010Q3.html ?
> [...]

I forgot to mention that I'm using the latest xf86-video-intel (from git) and the latest drm-intel-next (also from git).

As far as I can tell from that release note (and the diffs associated with it), the change was only made for CRT outputs. Making the same change to intel_hdmi.c that was made to intel_crt.c makes the kernel expose the interlaced modes to X, but they are displayed incorrectly (I'm assuming there's some small additional change needed to fix it, I just don't know what it is).
Is the issue still there with the latest Intel graphics stack?
I can check and get back to you in the next few days - that machine is currently disconnected as I'm in the process of moving.
Hi all,

I can confirm the bug still exists with the 2.14.0 X driver and kernel 2.6.38.2 on Ubuntu Natty 11.04, using an i945GC chipset IGP. I have almost the same symptoms as comment #15: the two fields are displayed at once on the screen, separated by a black bar in the middle.

I think this is due to a bug that makes the vertical active pixels (etc.) get divided by 2 twice. Searching on Google, I've seen the exact same bug on the old radeon driver, where it was caused by xf86SetModeCrtc dividing the vertical coordinates by 2 while the driver did so as well (I don't know if this is relevant, just trying to provide insights here).

I'm trying to drive an OEM car monitor, which accepts 15.7 kHz/60 Hz composite-sync interlaced 480 lines, so I had to change the minimum clock limits in intel_display.c: I changed I9XX_VCO_MIN from 1400000 to 1000000 and I9XX_DOT_MIN from 20000 to 12000. The hardware combination I'm using worked fine under Windows with IEGD and PowerStrip, so it's not a hardware limitation.

Right now, if I double the vertical coordinates in the modeline from 480 to 960 etc., I get a fullscreen picture in which the mouse cursor is displayed perfectly fine, indicating the video signal is good, but the rest of the image scrolls very quickly from right to left and is divided into 4, which probably indicates a buffer problem.

Hope that helps. Don't hesitate to contact me if you need more information.

Regards,
Adrien ASESIO.
Oh, I forgot to say I'm using the analog VGA port. ;)
Created attachment 52886 [details]
Xorg log 1920x1080i

This is still not working as far as I can tell. No errors in the attached log, but the TV says "Picture format unavailable", which unfortunately isn't terribly useful for knowing exactly which mode has been programmed. I tried all the interlaced modes in the EDID, no luck. All with latest git master linux/Xorg/DDX.
I just tried booting with "video=VGA-1:1920x1080@60ie video=LVDS-1:d". I don't get a valid mode on my TV, but interestingly I do get:

streamer ~ # cat /sys/class/graphics/fb0/modes
U:1920x1080p-0

Notice it hasn't actually created an interlaced mode!
Patch here: http://lists.freedesktop.org/archives/intel-gfx/2012-January/014359.html
Patch version 2 here: http://lists.freedesktop.org/archives/intel-gfx/2012-January/014504.html
(In reply to comment #26)
> Patch version 2 here:
> http://lists.freedesktop.org/archives/intel-gfx/2012-January/014504.html

These patches fix this SDVO-HDMI interlace issue.

Tested-by: Sun Yi <yi.sun@intel.com>
Bumping to dri as I think that is the current blocker.
Just an FYI: interlaced support for SDVO, HDMI, DP and VGA on gen3+ is on track to be merged in 3.4.
The patches are merged into 3.4. If you have an individual issue, please open a new bug report.