they all seem related, so i'm filing only one report. all problems are triggered by tvtime; xawtv and mplayer don't seem to cause [that many] problems here.

regressions since xfree 4.3:
- the pixmap cache seems to be trashed by the overlay
- painting causes the xv overlay to be disturbed

already in xf 4.3:
- moving the overlay in a way that makes it cross the screen borders destabilizes the display when the window is in certain positions
- maybe a tvtime bug, but there seems to be an uninitialized column of pixels at the right border of the overlay. maybe related to the alignment requirements of the chip.

the display is running at 1280x1024@85hz, that is, at the limits of the overlay unit.
Here's a "me too". I have observed all of these issues on my two Intel-based systems. I have a Dell Inspiron 300m notebook, as well as a desktop box based on an Intel D865GBF motherboard. They have 82855GM (Centrino) and 82865G graphics controllers, respectively. I have a KDE desktop with the Baghira visual style (Bluecurve doesn't have this problem). The app I use with Xv is Kaffeine, a video player based on the Xine multimedia library. With i810 drivers from XFree86 4.4 as well as X.org, new windows drawn on the screen are corrupt and make the overlay corrupt as well. Replacing i810_drv.o with a binary from XFree86 4.3 (found on rpmfind.net, from Mandrake, XFree86-server-4.3-30mdk.i586.rpm) makes the problem go away. Gentoo Linux, X.org 6.8, vanilla kernel 2.6.9, 1024x768. Screenshot at: http://home.columbus.rr.com/andrewbarr/xv-i810.png
Another "me too" from me. xv seems to accidentally use the buffer space that is already used by offscreen areas for its overlay. (and BTW. glxgears crashes the X Server though glxinfo reports everything should be OK) 0000:00:02.0 VGA compatible controller: Intel Corp. 82852/855GM Integrated Graphics Device (rev 02) 0000:00:02.1 Display controller: Intel Corp. 82852/855GM Integrated Graphics Device (rev 02) Linux agpgart interface v0.100 (c) Dave Jones agpgart: Detected an Intel 855 Chipset. agpgart: Maximum main memory to use for agp memory: 408M agpgart: Detected 32636K stolen memory. agpgart: AGP aperture is 128M @ 0xe0000000 server:/root # lsmod | grep i9 i915 16768 1 Section "Device" Identifier "Intel I855GM" Driver "i810" VendorName "Intel Corp." BoardName "82852/855GM Integrated Graphics Device" BusID "PCI:0:2:0" EndSection XOrg 6.8.1 (Gentoo Linux x11-base/xorg-6.8.0-r4.ebuild)
Created attachment 1842 [details]
Xorg running log, complete with lots of errors

This shows Xorg starting up in dualhead mode, what look to be engine errors, and so on.
Created attachment 1846 [details]
New driver

This driver is compiled from the CVS head code. Make sure you back up your old driver first, then give this a go and let me know if it seems any better.
negative. all the same. btw, no need to upload binaries for me - i have a cvs checkout here.
Is there a sample video anywhere which demonstrates the problem ? And which version of xine are you using ?
not xine, but tvtime, from cvs head. the sample is any pal video signal, obviously. tvtime is using xv-shm for the overlay.
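for reference, the xv-shm path is roughly the standard one; here is a minimal sketch of what such an upload looks like (a hypothetical helper, not tvtime's actual code -- dpy, port, win and gc are assumed to be set up already, and 0x32595559 is the yuy2 fourcc):

/* sketch of a shared-memory xv upload; dpy/port/win/gc are assumed to exist
 * already, 0x32595559 is the YUY2 fourcc */
#include <sys/ipc.h>
#include <sys/shm.h>
#include <X11/Xlib.h>
#include <X11/extensions/XShm.h>
#include <X11/extensions/Xvlib.h>

static void put_yuy2_frame(Display *dpy, XvPortID port, Window win, GC gc)
{
    XShmSegmentInfo shminfo;
    XvImage *im = XvShmCreateImage(dpy, port, 0x32595559, NULL, 768, 576,
                                   &shminfo);

    shminfo.shmid = shmget(IPC_PRIVATE, im->data_size, IPC_CREAT | 0777);
    shminfo.shmaddr = im->data = shmat(shminfo.shmid, NULL, 0);
    shminfo.readOnly = False;
    XShmAttach(dpy, &shminfo);

    /* ... fill im->data with one YUY2 frame ... */

    XvShmPutImage(dpy, port, win, gc, im,
                  0, 0, 768, 576,    /* full source frame */
                  0, 0, 1280, 1024,  /* scaled destination */
                  False);
    XSync(dpy, False);
}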
It happens with every program that uses Xv overlays, in combination with applications using offscreen areas. e.g. watching any video in mplayer while using gnome-terminal makes the video flicker like mad (text appearing in strange colors in the video). With mplayer + Mozilla, sometimes parts of the video appear inside images, and things like that. It's impossible not to reproduce it for me. :)

I'll check the CVS version; somehow the download gave me a corrupted ELF object.
That's going to be tricky to replicate. I could use an easier method to demonstrate the problem without having to plug in a TV card.
Cristophe, I use mplayer and can't see any problems on my 845G or 865G here. So it may be caused by a specific video file or a certain version of the player which I need to replicate to be able to fix the problem.
with mplayer i can reproduce it, too, even though i had to try harder (i had to load a page in konqueror first, but then all kinds of "interesting" things happened to the pixmap caches). btw, your binary in an xfree 4.3 + debian environment had the same effects, so it's definitely the driver, not the framework.
Ossi,

Can you attach a log file so I can see your setup?
Created attachment 1850 [details]
my log
Created attachment 1851 [details]
Attempt to fix pixmap cache overwrite

Another driver. Sorry about the binary again, it's just easier for me at the moment.

See if this helps.
looks awfully good for the pixmap corruption part. :-)

> Sorry about the binary again, it's just easier for me at the moment.

heh, i won't ask about _those_ circumstances ... ;)
btw, the interference with repainting is also gone, unsurprisingly. the destabilization effect is still reproducible, but became really hard to trigger after i started overclocking a few days ago, so i suppose this is a bandwidth problem, completely unrelated to the corruption issue. the "uninitialized pixels" issue persists as well. even if tvtime is misbehaving (is it?), it might be a good idea to let the driver initialize the "odd" pixels if the actual dimensions don't match the requested ones. anyway, i'll contact the tvtime author.
Bingo! Works for me. :) (not CVS but the last driver you posted) What was the problem?
(In reply to comment #14)
> Created an attachment (id=1851) [edit]
> Attempt to fix pixmap cache overwrite
>
> Another driver. Sorry about the binary again, it's just easier for me at the
> moment.
>
> See if this helps.

Unfortunately I can't load this driver -- unresolved symbols. All of my modules are in the new dlloader (?) .so format; perhaps there's some kind of issue with loading a mixed environment or such.
Committed fixes to CVS. Closing. Thanks for testing, people.
thx. btw, i saw an xv alignment-related commit. was this supposed to fix the problem i have with tvtime? because it did not ...
If you mean the uninitialized pixels, I think it should be tvtime's job to initialize them, if it genuinely is 'uninitialized pixels' (which I'm not sure of). The application 'tvtime' should be painting the background with the color key anyway, so I'm not quite sure what you're saying or how it looks on-screen. I've tried to reproduce the problem with mplayer, but I can't, so some pictures would be helpful.

But did you contact the tvtime author? Can you reproduce it with mplayer or another player that doesn't require a TV card?
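For reference, painting the color key is just an ordinary fill with the port's XV_COLORKEY value; a minimal sketch (not tvtime's actual code -- dpy, port, win and gc are assumed to be set up already):

/* fetch the overlay color key from the Xv port and fill the window with it;
 * sketch only -- dpy/port/win/gc are assumed to exist already */
#include <X11/Xlib.h>
#include <X11/extensions/Xvlib.h>

static void paint_colorkey(Display *dpy, XvPortID port, Window win, GC gc,
                           unsigned int width, unsigned int height)
{
    Atom colorkey = XInternAtom(dpy, "XV_COLORKEY", False);
    int key = 0;

    if (XvGetPortAttribute(dpy, port, colorkey, &key) == Success) {
        XSetForeground(dpy, gc, (unsigned long)key);
        XFillRectangle(dpy, win, gc, 0, 0, width, height);
    }
}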
i don't think it is reproducible with any other prog i have here. the problem appears when one enables overscan in tvtime. i guess tvtime keeps the source image as-is, but changes the xv parameters to show only a centered excerpt. i suspect the driver rounds up the excerpt dimensions, while tvtime assumes they are what it told them to be and does not update the supposedly "dead" area.
Then, as tvtime created that window, I think it should be its responsibility to clear its contents. Having the driver clear it is not advisable, as that impacts other applications which don't need it.
that's not the point. my guess is that the driver rounds up the dimensions of the visible excerpt. if this is true, there is nothing tvtime is doing wrong. just to confirm the "same-source" theory: the "uninitialized" pixels are in fact pixels from previous scenes if the overscan was smaller before.
It's all guesses at the moment. Neither of us knows what tvtime is doing exactly without looking at their code. Why don't you try clearing in the driver's Xv code and see if you can come up with something that works for you. I can't replicate the problem here, as I can't run tvtime to fix it.
ok, i had a look at the driver. i'm quite ill, so i don't feel able to grasp it fully or even come up with a patch. anyway, so far i think i was right; everything seems to be rounded up. in itself this is probably no problem; only the scale factor calculation should be done with the original (or even rounded-down) size, so the overlay unit does not read beyond the user-supplied image data.
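to illustrate what i mean (illustration only, not the actual i830 driver code -- the names are made up; src_w is the width the client passed to XvPutImage, dst_w the destination width on screen):

/* illustration only, not the actual driver code; names are made up */
static double xscale_suspected(unsigned int src_w, unsigned int dst_w)
{
    unsigned int aligned_w = (src_w + 1) & ~1u;   /* width rounded up to even */
    /* scale derived from the rounded-up width: the last fetched source
     * column can lie beyond the data the client actually wrote */
    return (double)aligned_w / (double)dst_w;
}

static double xscale_suggested(unsigned int src_w, unsigned int dst_w)
{
    /* scale derived from the requested width, so the overlay unit never
     * reads past the supplied image */
    return (double)src_w / (double)dst_w;
}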
Can you actually take a picture of the problem? I'm trying to understand what you are saying, but I can't visualize it.
i have no digicam at hand and screenshotting xv is obviously inherently impossible, so i'll make a bit of ascii art:

-----------*
-----------*
-----------*
-----------*

the minuses are ok pixels, the asterisks are "left-over" pixels. note that these pixels are one column of pixels of the source image - at sufficiently large zoom levels, this results in multiple columns of visible trash. whether it appears at all depends on whether the width of the excerpt is divisible by two (probably - this is still theory).

the source image supplied by tvtime looks like this:

##########
#--------#
#--------#
#--------#
##########

the minuses are displayed pixels. the hashes are invisible pixels, which are not updated by tvtime. i think both the offsets and the width of the visible area can be odd, as tvtime calculates them in floating point. so the next step would be checking the math of the driver to verify that, indeed, the overlay unit can read beyond the user-supplied data when "something" is uneven.
How about you compile the driver with the XVIDEO debug flag set. At the top of the file i830_video.c you'll see it. Define it to 1 and grab a log. It's all too much theory without some hard data.
Created attachment 1888 [details]
attempt to fix ossi's overlay problem

Mmm, just been poring over the code. I might have found something. Try this driver.
no, the new driver does not help. this comes from a frame where the effect is perfectly visible:

OVERLAY_UPDATE
srcPitch: 1536, dstPitch: 1536, size: 884736
I830DisplayVideo: 768x576 (pitch 1536)
CompareOverlay: no differences
Buffers: Y0: 0x00520000, U0: 0x006d0000, V0: 0x0073c000
Buffers: Y1: 0x005f8000, U1: 0x007a8000, V1: 0x00814000
pos: 0x00000000, size: 0x04000500
dst: 1280 x 1024, src: 564 x 423
xscale: 0.70c, yscale: 0.69c
UV xscale: 0.386, UV yscale: 0.34e
YUV422 OCMD is 0x00002005
OVERLAY_UPDATE

hmm, even more strange things ahead: now i noticed that the bottom line can show the same effect, but i only ever see it after switching vts (that is, when the memory is messed up beyond hope, i suppose) and with unreasonably big overscans (11% to each side). i also found out that tvtime in fact always renders the entire frame, even if it does not display all of it. this sort of eliminates tvtime as the culprit.
Well, thanks for the info. Unfortunately until I can re-create it I can't fix it. I'll re-open the bug though so it doesn't get lost.
a program to reproduce it should be simple to create:

- create a fixed-size source image, maybe a pal-style 768x576 yuy2.
- create a fixed-size output window. the size doesn't matter too much, as long as it is bigger than the source image. fullscreen helps to see it.
- create a main loop that lets you interactively move and resize a visible region.
- fill the source image with a color that is calculated from a hash of the coordinates and size of the visible region, and XvPutImage it (the core of this step is sketched below).

now, if you see some borders that still have the old color after altering the visible region, you have reproduced the bug.
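roughly, the core would look like this (just a sketch, not the actual test program from the later attachment -- dpy, port, win, gc and a 768x576 yuy2 XvImage are assumed to be set up already; vx/vy/vw/vh is the visible region):

/* sketch only; dpy/port/win/gc and the XvImage "im" are assumed to exist */
#include <X11/Xlib.h>
#include <X11/extensions/Xvlib.h>

static void show_region(Display *dpy, XvPortID port, Window win, GC gc,
                        XvImage *im, int vx, int vy, int vw, int vh,
                        int win_w, int win_h)
{
    /* derive a color from a hash of the region, so every change of the
     * region recolors the whole source image */
    unsigned int hash = (unsigned int)(vx * 31 + vy * 37 + vw * 41 + vh * 43);
    char y = (char)(hash & 0xff);
    char u = (char)((hash >> 8) & 0xff);
    char v = (char)((hash >> 16) & 0xff);
    int i;

    /* fill the whole YUY2 frame with that color */
    for (i = 0; i + 3 < im->data_size; i += 4) {
        im->data[i + 0] = y;
        im->data[i + 1] = u;
        im->data[i + 2] = y;
        im->data[i + 3] = v;
    }

    /* display only the visible excerpt, scaled to the whole window; any
     * border that keeps the previous color afterwards is the bug */
    XvPutImage(dpy, port, win, gc, im,
               vx, vy, vw, vh,
               0, 0, win_w, win_h);
    XSync(dpy, False);
}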
Let me know when you've coded it, and include a sample YUY2 video to go with it.
is a qt app ok or must i learn Xlib/Xt first? :}
Qt's fine. Can you do a TV capture to produce a YUY2 sample movie too?
Created attachment 1889 [details]
test program

here it is. the include path needs qt and x11, the lib path the same, and the libs are -lqt-mt and -lXv.

the top-left corner can be moved with ctrl-<arrow>, the bottom-right corner with alt-<arrow>, and with unmodified arrows the entire window slides. space just cycles the colors and return repaints only. stdout shows the current rectangle.

i can trigger the problem after pressing alt-left a few times. interestingly enough, every repaint is different.
O.k. The last of Ossi's reports here is fixed now, and I'll be committing my changes shortly.