- System: AOpen MiniPC MP965-DR
- CPU/RAM: Core 2 Duo T7500, 2GB RAM
- OS: Fedora 8
- X server: xorg 1.4.99.1-0.10.fc9
- intel_drv.so: built from git checkout 12/09
- mesa/drm drivers: built from git checkout 12/09
- System outputs: VGA, TMDS-1, LVDS, TV
- TV: 32" Sony WEGA, NTSC 480i only

This bug documents an individual problem; several problems were reported at once in bug #13611, and I am now separating them as requested.

Notes:
1. This bug is a display artifact present on both the Composite and S-Video TV outputs in NTSC-M and NTSC-J modes.
2. This bug is not present on the Component output, which appears perfect.
3. The bug appears as strong color pulsing affecting any non-black/white/grey area of the screen.
4. Changing the output resolution does not alter the effect of this bug.
Can you attach your xorg.conf?
Created attachment 13483: xorg.conf
The xorg.conf used is the same as for the other bugs that I've filed (i.e., bug #13743). I can only get the TV display working if I disconnect the other outputs (VGA) before booting. If I boot with Component connected, the picture looks fine, but if I boot with either S-Video or Composite connected, I get this color pulsing.
I found the solution to this problem. After reading a lot about the NTSC standard, I hypothesized that the color burst frequency was incorrect. In the "tv_modes" table there are parameters called dda1_inc, dda2_inc, and dda2_size which determine the color burst frequency. The comment for the NTSC-M mode claims that the given values (136, 7624, and 20013 respectively) correspond to a frequency of exactly 3.58 MHz, and plugging these values into the equation in the code comments does indeed yield exactly 3.58 MHz. But that is actually the wrong target: the Wikipedia entry for NTSC gives the color burst frequency as 4.5 MHz * (455/572), and notes that TV receivers are very sensitive to inaccuracies in this signal.

Working backwards from those numbers, I arrived at values of 136, 208, and 572 for dda1_inc, dda2_inc, and dda2_size, which generate a proper frequency of about 3.5795454 MHz. I still saw some color pulsing with these values on my Sony WEGA SDTV (though much less than before), so I decreased the frequency a bit more by using 200 instead of 208 for dda2_inc. With that value, all of the color artifacts disappeared.

I tested on two TVs. With the original subcarrier values, my WEGA SDTV had strong color pulsing and my Zenith SDTV displayed no color at all, just a black-and-white image. But with the values 136, 200, and 572, both TVs displayed a good picture with no color artifacts. I tested both the S-Video and Composite signal outputs, and the results were the same.

Therefore, I would recommend replacing the existing values of (136, 7624, 20013) with (136, 200, 572) in the NTSC-M, NTSC-J, and PAL-M mode descriptions. The frequency given in the comments should also be changed from 3.5800000 to 3.5795454.
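For reference, here is a minimal standalone check of the arithmetic above. It is a sketch under two assumptions that are not quoted in this thread: that the driver's code comments give the DDA relation Fsc = Fclock * (dda1_inc + dda2_inc / dda2_size) / 4096, and that Fclock is 107.520 MHz for these modes (any dda3 terms are taken as zero for NTSC-M):

    /* Worked check of the subcarrier DDA arithmetic discussed above.
     * ASSUMPTION: Fsc = Fclock * (dda1_inc + dda2_inc / dda2_size) / 4096,
     * with Fclock = 107.520 MHz, per my reading of the driver comments.
     */
    #include <stdio.h>

    static double subcarrier_hz(int dda1_inc, int dda2_inc, int dda2_size)
    {
        const double fclock = 107520000.0; /* 107.520 MHz (assumed) */
        return fclock * (dda1_inc + (double)dda2_inc / dda2_size) / 4096.0;
    }

    int main(void)
    {
        /* Original table values land exactly on 3.58 MHz */
        printf("original (136, 7624, 20013): %.2f Hz\n",
               subcarrier_hz(136, 7624, 20013));
        /* Spec-exact values hit 4.5 MHz * 455/572 */
        printf("spec-exact (136, 208, 572) : %.2f Hz\n",
               subcarrier_hz(136, 208, 572));
        /* Final proposed values, slightly below nominal */
        printf("proposed (136, 200, 572)   : %.2f Hz\n",
               subcarrier_hz(136, 200, 572));
        printf("NTSC nominal subcarrier    : %.2f Hz\n",
               4500000.0 * 455.0 / 572.0);
        return 0;
    }

Under those assumptions this prints 3580000.00, 3579545.45, 3579178.32, and 3579545.45 Hz respectively; that is, the original table sits about 455 Hz above the nominal subcarrier, and the final proposed values sit about 367 Hz below it.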
Nanhai/Zhenyu, any comments on Richard's findings?
It seems the bug reporter has even helped us root-cause the bug; it should be easy to fix in the next step.
Michael, we don't currently have any 965GM laptop with TV output; that's why I asked to buy one. Yes, it looks like we currently use a 3.58 MHz subcarrier for NTSC-M. As Keithp noted, this gave him a more stable image on his TV set, even though the spec says 3.579545 MHz, so it might be good if our driver could handle this via some config option. In any case, we need the hardware available first before we can try this.
Zhenyu, I talked with one of the hardware guys about this. I think we need to audit our driver's TV-out settings against the latest b-spec; there may have been updates we missed. Also, the TV scaler only supports a limited set of resolutions, so we should validate against the supported list as well (if we don't already; it's been a while since I looked at that code).
True. I was thinking of adding register blocks from the vBIOS into our driver, but we still need to find the root of Keith's subcarrier formula, or why it's broken.
Please test against current xf86-video-intel git master.
ping again
I can test this on the weekend. The problem that I anticipate is that my HTPC machine is running xserver 1.4 and I don't really want to build and install a new (1.5) version. Will the head of the git repo for this driver build and run properly under xserver 1.4? If not, can you recommend a distro that I could install which already comes with X 1.5?
Our driver always aims to stay compatible with at least xserver 1.4, and building the current 1.5 releases or git master only requires libdrm 2.4, which is compatible with 2.3 too.
Okay, I tried to compile the latest from git and failed. xf86-video-intel wouldn't compile because it requires libdrm 2.4, and I only have 2.3 installed. So I updated the drm module to the tip of the git tree, and that one failed to build the i915.ko module in the linux-core directory. My kernel version is 2.6.23.9-85.fc8. What do I need to do to make it build the i915.ko module?
You are right to use libdrm from mesa/drm on freedesktop.org, but don't use the drm kernel module from that tree (Intel has stopped drm development there); instead, use the drm kernel module shipped with a recent (2.6.27 or .28) kernel. See http://intellinuxgraphics.org/download.html. Alternatively, you can try Fedora 10 rawhide.
I verified that this bug is fixed with recent master on the same hardware, so I am marking this as resolved. Richard, feel free to reopen if you still have an issue with this one.