Hello, I'm using my PAL modeline to drive my TV directly over the VGA connector. When upgrading from Ubuntu Feisty to Ubuntu Gutsy, this stopped working for me: the first half of the screen is shown fine, then there's a horizontal black bar, and then the first half of the screen is shown again in the lower half of my TV set. I can tell that it's divided in the middle because I see half of the mouse pointer, which is usually placed in the middle of the screen.

When I downgraded from xserver-xorg-video-ati-6.7.193 to xserver-xorg-video-ati-6.6.3, it started to work again. In an attempt to narrow down the cause, I got xf86-video-ati from git. I have 82 revisions left for testing, but I'll provide my interim results here:

Commit 2618cf2aa8ed76411b943eb90c95869814c2f151 from May still works
Commit 0abce69f0d826a7ca1a41d963cd4730b6e01c145 from April is broken

I didn't use git-bisect for that because I'd keep hitting revisions which wouldn't compile properly or exhibited other problems. So, if anyone has a fix or knows which revisions might be interesting, I'd appreciate any input. If not, I'll give git-bisect another try tomorrow (with my narrowed-down set of revisions this time).

Thanks,
Michael
(In reply to comment #0)
> Commit 2618cf2aa8ed76411b943eb90c95869814c2f151 from May still works
> Commit 0abce69f0d826a7ca1a41d963cd4730b6e01c145 from April is broken

Sorry, I got that backwards:

0abce69f0d826a7ca1a41d963cd4730b6e01c145 is working
2618cf2aa8ed76411b943eb90c95869814c2f151 is broken

Also, I filed a bug in Ubuntu a few days ago:
https://bugs.launchpad.net/xorg-server/+bug/144322
Hi again,

> I didn't use git-bisect for that because I'd keep hitting revisions which
> wouldn't compile properly or exhibited other problems.

I tried to narrow down the problem further, but I've been unable to do so. git-bisect will give me revisions which won't compile. If I pick revisions at random, they will either lock up my box when I start X or do other weird stuff. So, I'm afraid I need some help. If anyone has a hint on how I can debug this further, or if a developer is looking at this, please let me know. I'm willing to test revisions and patches, but I cannot do so without help.

FYI, this problem also shows up with an X300:
http://www.gossamer-threads.com/lists/mythtv/users/294492#294492

Regards,
Michael
*** Bug 12727 has been marked as a duplicate of this bug. ***
I'm the X300 owner, and I am indeed seeing the same bug: the screen split in half with the top copied to the bottom and the vblank in the middle. I am also using a VGA->SCART cable to connect my X300 to an analog 16:9 PAL TV. I can get an image on the TV if I remove the "interlace" keyword and choose a progressive modeline with half the scanlines of my interlaced modeline.
Does disabling tiling fix it?

Option "ColorTiling" "FALSE"
No, disabling ColorTiling doesn't work for me, although Xorg.1.log clearly says that it's disabled:

(**) RADEON(0): Option "NoAccel" "true"
(**) RADEON(0): Option "SWcursor" "on"
(**) RADEON(0): Option "IgnoreEDID" "true"
(**) RADEON(0): Option "ForceMinDotClock" "12MHz"
(**) RADEON(0): Option "ColorTiling" "false"
(**) RADEON(0): Option "RenderAccel" "false"
(**) RADEON(0): Option "VGAAccess" "false"
(**) RADEON(0): Option "DRI" "false"

I still have the same problem. For this test, I used xserver-xorg-video-ati version 1:6.7.195-1ubuntu2 from Ubuntu because I don't expect git HEAD to show much difference.
Can you use radeontool to dump the regs as set by the old driver and the new driver, using this version of radeontool:
http://cgit.freedesktop.org/~airlied/radeontool/

radeontool regmatch '*'

and attach both dumps.
Created attachment 11991 [details] registerdump when using the broken driver
Created attachment 11992 [details] regdump using the working driver
Done!
Can you also attach your xorg log from the old working driver? Also, which connector are you using for the interlaced output?
Can you try again with the latest ati git? I think this should be fixed.
Alex,

> Can you try again with the latest ati git? I think this should be fixed.

Sorry for not getting back to you earlier. Using latest git, the X server just crashes. I'll attach my Xorg.log in a minute. I've also tried your commits from the 15th of this month, and those are also crashing.
Created attachment 12100 [details] Xorg log of crash using latest git
Let me know if you need a proper backtrace using gdb.

Do you still need an Xorg log from the working driver?
(In reply to comment #15)
> Let me know if you need a proper backtrace using gdb.
>
> Do you still need a xorg.log using the working driver?

Yes on both. Thanks.
Created attachment 12109 [details] gdb backtrace for segfault with latest git I hope this backtrace is OK, I don't know a lot about these things. (The first time I tried getting one my box froze, worked the second time. phew.)
Created attachment 12111 [details] [review] Xorg.1.log of working driver, rev 0abce69f0d826a7ca1a41d963cd4730b6e01c145 Here's the missing log file. Let me know if you need anything else. Thanks for working on this, btw.
(In reply to comment #17)
> Created an attachment (id=12109) [details]
> gdb backtrace for segfault with latest git
>
> I hope this backtrace is OK, I don't know a lot about these things. (The first
> time I tried getting one my box froze, worked the second time. phew.)

Can you try again with tiling disabled?

Option "ColorTiling" "FALSE"
(In reply to comment #17)
> Created an attachment (id=12109) [details]
> gdb backtrace for segfault with latest git

This crash should be fixed now.
I've just tried with latest git:

* With color tiling disabled, it is still broken (same behavior as before).
* With color tiling enabled, the screen is split in half like before, but there are horizontal black lines throughout the picture, maybe a few pixels high. Judging from the noise, my TV set doesn't really like that :)
Created attachment 12119 [details] regdump using latest git rev a9306b7986467c0794c91625ad2ece597b718462 Here's a register dump just in case you need it. Xorg log is coming...
Created attachment 12120 [details] Xorg log using latest git rev a9306b7986467c0794c91625ad2ece597b718462
*** Bug 13057 has been marked as a duplicate of this bug. ***
Hi Alex,

since Gutsy has been released in the meantime, more people are getting bitten by this bug (e.g. #13057). Is there anything we can do to help get this fixed?

Regards,
Michael
Hello, I posted that dupe bug. I am happy to see this issue is already being worked on; I've had it for two weeks but didn't have time to report it, and I couldn't find anything about it on Google. I use a Radeon 9600 Pro.

By the way, I might be off topic, but I'm taking the occasion to ask here if anyone has managed to use these PAL modelines to obtain a valid display on the framebuffer, i.e. on the non-X console display. I see something on the output, but the image appears only on half of the screen and is not stable at all, scrolling up and down very fast. It's the same on the Windows XP login. I tried things using radeonfb and /etc/fbdev.conf, but that was worse: nothing appeared on the screen at all.

Also a bit off topic, but I guess you noticed as well: the composite option in the modeline has no effect on the fglrx driver. I'd be interested if you could point me to a place with discussions about these VGA -> PAL issues.

Bye.

(In reply to comment #25)
> Hi Alex,
>
> since Gutsy has been released in the meantime, more people are getting bitten
> by this bug (eg #13057), is there anything we can do to help getting this
> fixed?
>
> Regards,
>
> Michael
Same problem here. I notice that radeon_modes.c has changed significantly; is the interlace flag still being picked up from the modeline so that V_INTERLACE gets set?
Sorry, I meant mode->Flags gets V_INTERLACE set.
Has anyone tried using xrandr --newmode <modeline> to add the interlaced modeline? I was going to try it, but I'm working remotely and the X server just locked up the box! :-(
(In reply to comment #29)
> Has anyone tried using xrandr --newmode <modeline> to add the interlaced
> modeline. I was going to try it but I'm working remotely and the X server just
> locked up the box! :-(

No, I haven't tried it yet. However, I have installed the latest git as of today to see if the issue is still there. Yes, the screen is still garbled :/ This time, however, I don't get the same screen two times as described above. It's even more garbled; now I see the upper third of the screen three times.

Any ETA on a fix? I know you guys are busy, I just want to know when I should start nagging again :) Do you need any additional logs? SSH access to my box?
Like the topic starter, I ran into the same problem trying to get RGB output from the VGA connector with PAL timings (15625 Hz horizontal, 50 Hz vertical, interlaced). When this error occurs, my CRT OSD displays 100 Hz as the vertical frequency (it should be 50)! Hooking up an oscilloscope to the VGA vertical sync output confirms this 100 Hz frequency.

I think I can explain the screen as described by the topic starter: the black horizontal border in the middle of the screen is just the extra vsync pulse. After this extra vsync pulse the vertical counters are reset, and the top half of the screen is repeated in the lower half.

Workaround to get interlaced PAL output using 6.7.196:

In the latest geexbox developer version, xorg has DRI stuff included and I no longer managed to compile 6.6.3 in it :-( However, I stumbled on a workaround:

1) Start Xorg using a PAL interlaced setting. RGB output is like the reported bug.
2) Use "xrandr -display :0.0 -s 1" to switch to a different (also PAL interlaced) setting. Now RGB output is OK! Even after switching back to the startup config using "xrandr -display :0.0 -s 0", PAL output is valid!

I have these lines in my xorg.conf:

HorizSync 15-16
VertRefresh 49-51
Modeline "PALWS" 24.75 1280 1321 1437 1584 576 580 585 625 -hsync -vsync interlace
Modeline "PAL" 22.50 1152 1195 1301 1440 576 580 583 625 -hsync -vsync interlace

The intention of these two PAL modelines: my VGA to SCART converter uses the vertical sync length (note 585 vs. 583 in the vertical part of the modelines) to switch the SCART 16:9 pin.
Just to note this is affecting my Radeon X550 after upgrading to Fedora 8 (xorg-x11-drv-ati.x86_64-6.8.0-1.fc8); this has worked fine from Fedora 5 to Fedora 7. I'm using VGA->SCART to a PAL TV on the VGA port (port 0), with nothing connected to the DVI or STV ports (1 and 2).

What is actually being displayed is the odd scanlines from the top half of the screen, double-scanned, then a thick black + thin white vsync pulse in the centre of the screen, followed by the even scanlines from the top half of the screen, double-scanned. The top half of the screen, though shown twice, is not shrunken in height.

I confirmed the double-scanning by setting a special wallpaper which consists of (G)reen/(M)agenta/(B)lack pixels as follows:

BBBBBBBBBBBBBBBBMMMMMMMMMMMMMMMMM
GGGGGGGGGGGGGGGGBBBBBBBBBBBBBBBBB
BBBBBBBBBBBBBBBBMMMMMMMMMMMMMMMMM
GGGGGGGGGGGGGGGGBBBBBBBBBBBBBBBBB

Normally this gives a horrid interlaced flicker, but I used it previously to demonstrate that interlace is working properly. With 6.8.0 the top half of the display shows as pure

BBBBBBBBBBBBBBBBBMMMMMMMMMMMMMMMM
BBBBBBBBBBBBBBBBBMMMMMMMMMMMMMMMM

and the bottom as pure

GGGGGGGGGGGGGGGGGGBBBBBBBBBBBBBBB
GGGGGGGGGGGGGGGGGGBBBBBBBBBBBBBBB

I can't get any joy using the xrandr trick as suggested above.
Previous version that worked for me was xorg-x11-drv-ati-6.6.3-4.fc7.x86_64.rpm
Created attachment 15187 [details] [review]
Fix for radeon interlace modes

You'll be interested to know I've written a patch to fix this, by doubling the values of the vtotal/vdisplay/vsyncstart CRTC registers when in interlaced modes; the same should probably be done for the CRTC2 registers. I've just reported this via Fedora's bugzilla, as my patch applies to the Fedora source after their other patches. Hopefully it'll find its way upstream from there, or you can tweak the patch to apply it for other distros; I've attached it here for reference.

https://bugzilla.redhat.com/show_bug.cgi?id=437700
(In reply to comment #34)
> Created an attachment (id=15187) [details]
> Fix for radeon interlace modes
>
> You'll be interested to know I've written a patch to fix this, by doubling the
> values of the vtotal/vdisplay/vsyncstart CRTC registers when in interlaced
> modes, the same should probably be done for CRTC2 registers.

I've just tested interlaced modes with the ati driver currently in git on several cards (RV280, RV350, RS690) and it works perfectly as is. Can you verify that you are still having a problem with the ati driver currently in git? Note that there are still issues with tiling when using multiple crtcs, but that is being sorted out now.
(In reply to comment #35)
> I've just tested interlaced modes with the ati driver currently in git on
> several cards (RV280, RV350, RS690) and it works perfectly as is. Can you
> verify that you are still having a problem with the ati driver currently in
> git?

I'll see if I can figure out git, and try to build from there; I have to confess my main interest is in distro-released versions.

> Note that there are still issues with tiling when using multiple crtcs,
> but that is being sorted out now.

Multi-crtc problems wouldn't show up on this machine; the only output is a TV.
I'm glad a coder is finally looking into this, and patched code is surfacing. Although the recently posted patch will work for most users, IMHO it's not a proper patch.

Over here I didn't use git, but applied the patch to the 6.8.0 source. On my Radeon 7000 setup, the patch works fine when starting Xorg into this mode:

Modeline "PALWS" 24.75 1280 1321 1437 1584 576 580 585 625 -hsync -vsync interlace

However, I do have an additional PAL modeline in xorg.conf:

Modeline "PAL" 22.50 1152 1195 1301 1440 576 580 583 625 -hsync -vsync interlace

Using xrandr to switch from the 1st modeline to the 2nd gives wrong PAL output! The screen OSD reads 15 kHz / below 30 Hz, where it should show 15 kHz / 50 Hz. This suggests the vertical registers are set to twice the required value.

It seems to me that somewhere after v6.6.3 a bug was introduced that somehow halves the vertical register settings in interlaced mode. On my Radeon 7000, this bug only shows up when starting in an interlaced mode, not after switching from an interlaced mode to another interlaced mode (that was my workaround: start interlaced, then jump to another interlaced mode).

"Hey, this pipeline is leaking half of the oil!" "Then start pumping twice as much to get the required output." Obviously that'll give the required output, but (as environmentalists will agree) it isn't the way to fix it ;-)
(In reply to comment #35)
> I've just tested interlaced modes with the ati driver currently in git on
> several cards (RV280, RV350, RS690) and it works perfectly as is. Can you
> verify that you are still having a problem with the ati driver currently in
> git?

Just checked out the ATI driver sources with git-clone from
git://anongit.freedesktop.org/git/xorg/driver/xf86-video-ati

Ran autogen and make, killed the X server, copied over the newly built driver and re-started the X server. I can confirm the same issue exists that I originally had. Will attach config and log files.
Created attachment 15295 [details] xorg.conf
Created attachment 15296 [details] Log file from git 19-MAR-2008
(In reply to comment #38)
> I can confirm the same issue exists that I originally had.

Perhaps it's just an issue with the PAL modes you are using. Is there any chance you can try an interlaced mode (like 1024x768i) on a monitor? Those modes work fine on all my cards, but I don't have a TV to test your PAL modes on. Perhaps forcing a lower min clock would help?
(In reply to comment #41)
> Is there any chance
> you can try an interlaced mode (like 1024x768i) on a monitor?

Connected a CRT and used the "industry standard" modeline:

ModeLine "1024x768" 44.9 1024 1032 1208 1264 768 768 776 817 +hsync +vsync Interlace

Unfortunately, the output is not correct. Only the top half of the screen is displayed, stretched to fill the whole screen; the individual scanlines are quite widely separated. The monitor seems able to lock on to the "double frequency" sync pulse better than the TV, as you'd expect from a multisync device I suppose, so there isn't a black sync bar visible on screen. The monitor OSD reports frequencies of 35 kHz horizontal, 174 Hz vertical.

> I don't have a TV to test you PAL modes on. Perhaps
> forcing a lower min clock would help?

The 13 MHz clock has worked with all previous drivers for more than 2 years, but I will certainly try forcing it even lower. Are config/log files required for the 1024x768i test?
With my "vertical * 2" patch I get a proper picture on the CRT monitor using the 1024x768i modeline; however, the monitor OSD reports an 87 Hz vertical frequency. I tried a "vertical * 4" version of my patch, to see if the monitor OSD would show the expected 43.5 Hz, but the monitor refused to show any picture.
(In reply to comment #43)
> the monitor OSD reports 87Hz vertical frequency.

OK, realised that the OSD must be displaying the field rate rather than the frame rate, so those numbers are OK.
(In reply to comment #41)
> Perhaps it's just an issue with PAL modes you are using.

I think all the people reporting this issue are using PAL modelines.

> Perhaps forcing a lower min clock would help?

Tried using 12.0 MHz (which I believe is the lowest the driver will allow); unfortunately, no difference.
I realised I hadn't done any non-interlaced tests. My TV doesn't support non-interlaced modes, but I have the CRT monitor available, so I tried 640x480 mode with it. This works fine (with and without my patch, and, as expected, it fails to produce a valid display if I make my patch unconditional instead of checking for V_INTERLACE).

It seems to me, from looking at the register dumps by Michael Haas and from debug prints I put into the driver, that *something* has halved the values of the vertical registers for interlaced modes. Presumably this is happening when the mode values are calculated (or adjusted), rather than (at the stage I inserted my hack) when they are poked into the CRTC registers.

I tried a git bisect, but rapidly ran back to versions of the driver that the X server refused to load (version compatibility with my server or drm module?). Also, I've looked "by eye" at the diffs between 0abce69f0d826a7ca1a41d963cd4730b6e01c145 and 2618cf2aa8ed76411b943eb90c95869814c2f151, but the "merge and fix conflicts" in 76670f665ebec7cdf40a04bf9379cb3ad4417507 are a bit too much for me to follow manually; so much other development has happened since that change that it's difficult to follow code that has been split into other files.

Perhaps this problem only affects certain cards. Any clue how I can proceed further?
Andy, since the new randr code went in the current driver uses completely different code paths for modesetting, a bisect is only of limited usefulness due to this. I don't believe the new code was properly tested for the interlaced (initial mode) case we need. You need to look at what the old (working) code did when initialising interlaced modes and ensure the current code does the same. I did have a look myself previously but couldn't see what was missing. I'll have another look, as you say it's probably in the timings calcs...
OK. It's obviously: xf86SetModeCrtc(adjusted_mode, INTERLACE_HALVE_V) not getting called in every case it should. It seems to get called unconditionally in the i810 driver...
(In reply to comment #48)
> OK. It's obviously: xf86SetModeCrtc(adjusted_mode, INTERLACE_HALVE_V) not
> getting called in every case it should.

Sounds likely, but rather that it's being called when it shouldn't be. I'll have to pull down some old git trees for comparison. It still doesn't explain why Alex sees interlaced modes work with a standard monitor and I don't (without re-doubling the verticals).
Perhaps it's only not working if the initial start-up mode is interlace? I'm unable to test at the moment since I'm having trouble building the xserver...
(In reply to comment #50)
> Perhaps it's only not working if the initial start-up mode is interlace? I'm
> unable to test at the moment since I'm having trouble building the xserver...

I'm able to add modes and switch between interlaced/non-interlaced on the fly via xrandr.
So I guess the question is how the code path differs between randr and startup. Perhaps the bug is related to where the initial mode structs get filled in, particularly where the xorg.conf modelines get added, rather than the mode programming itself?
(In reply to comment #51)
> I'm able to add modes and switch between interlaced/non-interlaced on the fly
> via xrandr.

I set my default mode to 800x600 non-interlaced and started X, then did

xrandr --newmode 720x576 13.88 720 741 814 888 576 580 587 625 +hsync +vsync interlace composite
xrandr --addmode VGA-0 720x576

It seems OK at this stage; xrandr lists the new mode. But when I do

xrandr --output VGA-0 --mode 720x576

it crashes my whole machine (the display does change to black). I'll see if I can get any sensible crash dump out of a serial console.
Hi, is anyone still looking into this? I have the same problem, and while xrandr does the job, it should be working from the start... I was looking at how the radeon module initializes, and I wonder if perhaps RADEONCrtcFindClosestMode() doesn't honour the interlace flag. All the commits between the working and non-working version seem pretty harmless, but something must have changed :(
(In reply to comment #54)
> Hi, is anyone still looking into this?

Not had a chance to look for a while; my "hacky patch" works for me, but is clearly not an acceptable solution.

> I have the same problem, and while xrandr does the job, it should be working
> from the start...

I never got an interlaced mode running with xrandr. Can you give me the exact steps you used to do that? What mode did you start X in? Did you just switch to an interlaced mode pre-defined in your xorg.conf, or did you define a new mode with xrandr and then switch to it?

> All the commits
> between the working and non-working version seem pretty harmless, but something
> must have been changed :(

Yep, but I rapidly got lost in them. It doesn't help that my TV is the only monitor on this machine, which hardly makes it convenient for editing/viewing large chunks of source code...
I have two working modes on my Radeon 7200 (R100):

ModeLine "720x576PAL" 15.125 720 770 842 968 576 579 607 625 Composite Interlace
Modeline "720x540PAL" 15.101 720 770 842 968 540 565 570 624 Composite Interlace

Additionally I define the virtual size, but I'm not really sure if it makes a difference:

Section "Screen"
  SubSection "Display"
    Virtual 720 576

$ export DISPLAY=":0.0"
$ xrandr -s 1
$ xrandr -s 0

And that's what I need to get a normal picture... I'm thinking of using your patch, but an upstream fix would be very welcome :)
Hello, I can confirm that this bug still exists in today's git. I'm using an interlaced mode, with a TV connected via SCART as output. I see the picture doubled with a black line in the middle. The picture is stretched 2 times, i.e. I see the first quarter of the picture twice. Horizontally the picture is correct.

My hardware is a Gigabyte mobo with an X1250 IGP. It would be great if someone could try to fix this. The patch above doesn't work for me.

Thanks,
Mike
(In reply to comment #57)
> My hardware is a Gigabyte Mobo with a X1250 IGP.

I just fixed some issues with interlaced modes on avivo cards in git, so the X1250 should work fine now with ati from git master. As for pre-avivo chips, unfortunately, I cannot reproduce any problems with interlaced modes here.
> As to pre-avivo chips,
> unfortunately, I cannot reproduce any problems with interlaced modes here.

I think vsync is calculated wrongly when an interlaced modeline is read, since xrandr sets it correctly. Anything I can do to help you?
Created attachment 17820 [details] Picture of X1250 with interlaced mode showing VDR.
Created attachment 17821 [details] Picture of X1250 with interlaced mode showing plain X screen.
(In reply to comment #58)

Sorry, but the latest git shows exactly the same behavior as before. I've attached two screenshots showing the problem.
With this patch I get on my hardware (X1250 IGP) at least the same behavior as the others, i.e. after switching resolution with xrandr I have a correct picture:

diff --git a/src/atombios_crtc.c b/src/atombios_crtc.c
index 70650e1..18885e6 100644
--- a/src/atombios_crtc.c
+++ b/src/atombios_crtc.c
@@ -493,7 +493,7 @@ atombios_crtc_mode_set(xf86CrtcPtr crtc,
     OUTREG(AVIVO_D1GRPH_X_END + radeon_crtc->crtc_offset, x + mode->HDisplay);
     OUTREG(AVIVO_D1GRPH_Y_END + radeon_crtc->crtc_offset, y + mode->VDisplay);
     OUTREG(AVIVO_D1GRPH_PITCH + radeon_crtc->crtc_offset,
-           crtc->scrn->displayWidth);
+           crtc->scrn->displayWidth << ((adjusted_mode->Flags & V_INTERLACE)?1:0));
     OUTREG(AVIVO_D1GRPH_ENABLE + radeon_crtc->crtc_offset, 1);
     OUTREG(AVIVO_D1MODE_DESKTOP_HEIGHT + radeon_crtc->crtc_offset,
(In reply to comment #63)

This patch is not quite correct. Now I can see the whole picture, but I can see only every second line. That means the lines are doubled by the hardware (like doublescan), and by doubling the pitch in the patch I skip every second line.

I played around with this line (src/atombios_crtc.c:655):

OUTREG(AVIVO_D1MODE_DATA_FORMAT + radeon_crtc->crtc_offset, AVIVO_D1MODE_INTERLEAVE_EN);

It makes no difference whether the AVIVO_D1MODE_INTERLEAVE_EN bit is set or not.
How are you adding the interlaced mode? Is it part of the edid from your monitor or a modeline or added at runtime using xrandr? As has been noted, I suspect the xserver may be mangling interlaced modes. Can you attach your xorg log?
Created attachment 17827 [details]
Xorg.log for interlaced mode with two xrandr -s x calls

This is the Xorg.log when starting the X server in interlaced mode. Shortly after starting, I switch the resolution with "xrandr -s 1; xrandr -s 0".
Created attachment 17828 [details]
xorg.conf for X1250 IGP with interlaced mode

This is the xorg.conf. I defined the interlaced mode myself. The same timing works with fglrx; radeonhd doesn't like this mode at all. The only difference from fglrx is that I had to change the sync polarity (+hsync -> -hsync ...).
I've had similar problems getting VGA->SCART (PAL RGB) to work. I have found one and only one way of getting a picture on my TV: add a PAL-compatible mode in xorg.conf (not with xrandr), but make sure X starts with a non-interlaced mode on the VGA output (e.g. the automatically added 1024x768@60Hz VESA mode). This gives me an un-synced scrolling picture on the TV. Then use xrandr to change to the PAL mode added in xorg.conf.

Any other procedure (starting X with the PAL mode set from the beginning, or adding the mode with xrandr) just gives me a black screen (with sync if the mode I set is PAL-compatible, without sync otherwise, but always just black). And the TV seems to display whatever it is sent even if it cannot sync to the signal, so it is probably sent a black picture by the graphics card.

When it works, the mouse pointer is shown in the wrong position on the TV: at half the y position compared to the correct position. (I.e. if I want to click something at the bottom of the TV screen, I have to position the mouse pointer in the middle of the TV screen.) The mouse pointer is not visible in "black mode".

Using "X Window System Version 1.3.0" and xf86-video-ati-6.9.0 (or ati-git-080929; same result).
I'd opt to commit Andy's fix to git. Nobody new is going to come up with a proper fix for these old drivers. At least this makes it work for a few, and it has no drawbacks. If it breaks again in a future Xorg version, we have a better chance of bisecting the real problem. This also prevents users with a similar problem but a different bug from diluting this report and never finding help themselves.
While checking why the mode->CrtcV* values are half of what they should be (compared to what they were in version 6.6.3), I came across the following comment in the function xf86InitialCheckModeForDriver (file hw/common/xf86Mode.c):

/*
 * NOTE: We (ab)use the mode->Crtc* values here to store timing
 * information for the calculation of Hsync and Vrefresh. Before
 * these values are calculated the driver is given the opportunity
 * to either set these HSync and VRefresh itself or modify the timing
 * values.
 * The difference to the final calculation is small but imortand:
 * here we pass the flag INTERLACE_HALVE_V regardless if the driver
 * sets it or not. This way our calculation of VRefresh has the same
 * effect as if we do if (flags & V_INTERLACE) refresh *= 2.0
 * This dual use of the mode->Crtc* values will certainly create
 * confusion and is bad software design. However since it's part of
 * the driver API it's hard to change.
 */

In version 6.6.3 the function RADEONPreInitModes (file src/radeon_driver.c) calls xf86SetCrtcForModes, which in turn calls xf86SetModeCrtc for each mode with adjustFlag 0. However, in version 6.9.0 the new function radeon_mode_fixup (file src/radeon_output.c) only recalculates the mode->Crtc* values under a few specific conditions, namely when MonType is MT_LCD or MT_DFP and the display resolution is smaller than the panel resolution (and rmx_type is not RMX_OFF and either IS_AVIVO_VARIANT or radeon_crtc_id == 0 is true). This means that at least for CRT monitors the CrtcV* values are left at half of what they should be.

To fix this I placed a call to xf86SetModeCrtc with adjustFlag 0 at the start of the radeon_mode_fixup function. But I'm not sure whether this is correct for all radeon cards, because I gather from the previous comments that this problem only seems to show up on older cards (I have an R350).

Note that the function radeon_mode_fixup contains two calls to xf86SetModeCrtc with the INTERLACE_HALVE_V flag.
But after the call, the Crtc* values are recalculated (without adjusting for interlacing or double scanning). However, the values CrtcVBlankStart, CrtcVBlankEnd, CrtcHBlankStart and CrtcHBlankEnd are left untouched.

The problem with the pointer image being placed too high is caused by the halving of the y coordinate in the function radeon_crtc_set_cursor_position when the V_INTERLACE flag is set. This wasn't done in version 6.6.3; removing it fixes the problem. Note that the y coordinate is doubled when the V_DBLSCAN flag is set, which matches the doubling of the CrtcV values in xf86SetModeCrtc, but the y coordinate is not multiplied by mode->VScan.

Also note that the corrected values in the 'adjusted_mode' don't seem to be propagated to the various copies of the mode in the xserver data structures; the adjusted_mode is just dropped at the end of the xf86CrtcSetMode function. So when the function RADEONDisplayVideo (radeon_video.c) uses crtc->mode.CrtcVDisplay, it gets the halved value. Unfortunately, this is not the cause of the Xv corruption I am experiencing on the second head (it is not related to interlacing).

Attached is a patch.
Created attachment 20529 [details] [review] Proposed patch to fix split screen and wrong pointer image placement
(In reply to comment #70)
> While checking why the mode->CrtcV values are halve of what they should be
> (like they were in version 6.6.3) I came across the following comment in the
> function xf86InitialCheckModeForDriver

That looks like a much better attempt to stop the problem happening, rather than my approach of fixing it after it has happened. I'm no longer using my VGA->SCART cable, having upgraded to HDMI and switched from a Radeon PCIe card to a motherboard with Intel video onboard.
You are a hero! This fixes the resolution and the mouse pointer, even on my older card which uses the legacy part of the driver. I consider this bug closed \o/
Pushed, thanks! 065938617c0feab17f4274a5350de02a692ba065
Sweet. Will upgrade to Ubuntu 8.10 (as the 6.9.0 git won't compile on my 8.04) and try this on my arcade system tonight. I have the exact same problems with the driver as it is today. I'll let you know the result...
This fix solved my problem. Thanks.