I've tried the latest Xorg server in Debian unstable i386, and it suffers a major loss of functionality, as described in the bug report below, copied from my Debian bug report -- see Bug 362977 http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=362977

Package: xserver-xorg-video-i810
Version: 1:1.5.1.0-2
Severity: important

Since I upgraded my Toshiba Libretto U105 to use the new xserver-xorg from unstable, Xorg shifts the graphical display contents by about 25 lines downwards and 768 pixels rightwards, wrapping lines around. The result is not quite usable. All in all that's 32768 pixels, at 4 bytes per pixel: a 128KiB offset. The upper section of the screen is initially filled with the contents of the previous VT, and is slowly being filled by regular patterns, about 26 pixels at a time, about one burst per second.

In the X log, I see notably:

(--) PCI:*(0:2:0) Intel Corporation 82852/855GM Integrated Graphics Device rev 2, Mem @ 0xd8000000/27, 0xd0000000/19, I/O @ 0xeff8/3
(--) PCI: (0:2:1) Intel Corporation 82852/855GM Integrated Graphics Device rev 2, Mem @ 0x20000000/27, 0x2a000000/19
...
(--) I810(0): Virtual size is 1280x768 (pitch 1280)
(**) I810(0): *Built-in mode "1280x768"
...
(II) I810(0): Attempting to use 61.11Hz refresh for mode "1280x768" (862)
...
(==) Depth 24 pixmap format is 32 bpp
(II) I810(0): Rotating to 0 degrees
(II) I810(0): initializing int10
(II) I810(0): Primary V_BIOS segment is: 0xc000
(II) I810(0): VESA BIOS detected
(II) I810(0): Allocated 128 kB for the ring buffer at 0x0
(II) I810(0): Allocating at least 256 scanlines for pixmap cache
(II) I810(0): Initial framebuffer allocation size: 7680 kByte
(II) I810(0): Allocated 4 kB for Overlay registers at 0x7fff000 (0x1491b000).
(II) I810(0): Allocated 64 kB for the scratch buffer at 0x7fef000
(II) I810(0): 0x82070d0: Memory at offset 0x00020000, size 7680 kBytes
(II) I810(0): 0x820b748: Memory at offset 0x00000000, size 0 kBytes
(II) I810(0): 0x820b940: Memory at offset 0x00000000, size 0 kBytes
(II) I810(0): 0x820b5fc: Memory at offset 0x00000000, size 128 kBytes
(II) I810(0): 0x8207110: Memory at offset 0x07fef000, size 64 kBytes
(II) I810(0): 0x820b998: Memory at offset 0x07fff000, size 4 kBytes
(WW) I810(0): PRB0_CTL (0x0000f001) indicates ring buffer enabled

As far as I can tell, mode switching happens only through the BIOS, and the attempt to offset the framebuffer away from the start of video memory may well be ill-fated. In any case, this ring buffer allocation seems premature to me, and looks like the cause of the bug.

I haven't kept logs from previous working versions of the server; if that information would help you, I'll send it once I have successfully downgraded my X server (ouch, a whole lot of stuff needs to be co-downgraded).

[ François-René ÐVB Rideau | Reflection&Cybernethics | http://fare.tunes.org ]
The risk is that if, one day, machines become intelligent, we mightn't be mentally equipped to notice they are. -- Tirésias, in J.-P. Petit, "A quoi rêvent les robots?"

-- System Information:
Debian Release: testing/unstable
  APT prefers unstable
  APT policy: (500, 'unstable'), (500, 'testing')
Architecture: i386 (i686)
Shell: /bin/sh linked to /bin/bash
Kernel: Linux 2.6.15.3-blefuscu
Locale: LANG=en_US.iso-8859-1, LC_CTYPE=en_US.iso-8859-1 (charmap=ISO-8859-1) (ignored: LC_ALL set to en_US.iso-8859-1)

Versions of packages xserver-xorg-video-i810 depends on:
ii  libc6               2.3.6-7    GNU C Library: Shared libraries
ii  xserver-xorg-core   1:1.0.2-4  X.Org X server -- core server

xserver-xorg-video-i810 recommends no packages.

-- no debconf information
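For what it's worth, the offset arithmetic in the report above checks out; a quick shell sanity check, assuming the 1280-pixel pitch and 32 bpp shown in the log:

```shell
# 25 lines down at a 1280-pixel pitch, plus 768 pixels right:
pixels=$(( 25 * 1280 + 768 ))
# at 4 bytes per pixel (32 bpp):
bytes=$(( pixels * 4 ))
echo "$pixels pixels, $bytes bytes ($(( bytes / 1024 )) KiB)"
# prints: 32768 pixels, 131072 bytes (128 KiB)
```

Which matches the 128 kB ring buffer allocated at offset 0x0, with the framebuffer pushed to offset 0x00020000 (128 KiB) in the log above.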
I ran the 7.0.0 server with the driver from the 6.9.0 server, and it works fine. The 7.0.0 module (non-working) is 1.5.1, and the 6.9.0 module (working) is 1.4.1. The 1.4.1 module also allocates the ring buffer first, but somehow manages to correctly tell the video card about the framebuffer being offset 128KiB. I can do more tests if you need them to locate the bug...
You should probably either get the 1.5.2 version, or grab the git repository and build it yourself.
Do you have an i686 binary module I could use to test the bug? I lack the time to set up a rebuild right now... If 1.5.2 is known to fix this problem, I suppose you can close this bug -- but I would still appreciate an updated binary driver.
You can get a new module from http://www.fairlite.demon.co.uk/intel.html
The ABI of this module is not compatible with my current server from the latest Debian (7.0.22), so I cannot test it. I tried to find instructions for compiling a whole server, but between git and cvs and a huge list of modules, I don't know where to start. Would it be possible to either post clear directions on what to compile (and how), or publish a full tarball of the related X server binaries to test in a chroot?
Use the -ignoreABI flag to the Xserver.
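For concreteness, the flag goes on the X server command line; a hedged sketch (the display number and startx usage are illustrative, not from this thread):

```shell
# -ignoreABI tells the server to skip the module ABI version check,
# so a driver built against a different ABI can still be loaded.
# Either pass it through startx:
startx -- -ignoreABI
# ...or start a second server directly on a spare display:
X :1 -ignoreABI
```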
Created attachment 6180 [details]
error output of X server with i810_drv 1.6.0 on Libretto U105

I finally tried the module with said server flag. Sorry for the delay. It is a definite improvement, in three ways:
(1) the zone of lost screen is much reduced (something between 10 and 20 lines);
(2) it is now aligned with the line length, so there is no horizontal shift;
(3) the patterns only eventually fill half of that space.

Relevant lines in the X error output:

(WW) module ABI major version (1) doesn't match the server's version (0)
(EE) I810(0): [dri] DRIScreenInit failed. Disabling DRI.

The X server error output ends abruptly, but more information is logged in /var/log/Xorg.1.log (attached -- and thanks to strace -e open for finding it).
From the look of that log, you've got a nicely broken Video BIOS. Try adding:

Option "MonitorLayout" "NONE,LFP"

to your Device section.
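In context, the option belongs in the Device section of xorg.conf; a minimal sketch (the Identifier and Driver lines are placeholders for a typical i810 setup, not taken from the reporter's config):

```
Section "Device"
    Identifier "Intel Graphics"
    Driver     "i810"
    Option     "MonitorLayout" "NONE,LFP"
EndSection
```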
Created attachment 6181 [details]
X log with MonitorLayout NONE,LFP

Here's the output with the option you recommended. Only the few top lines were visible, with the cursor double-sized, so the whole mode switch must have failed. The bottom of the screen was monochrome, black or white depending on my moving the cursor. With "LFP,NONE" instead, the whole screen remained black. Sorry about my Video BIOS being broken -- I don't think I can change it, but the old driver knew how to work around that brokenness, so I suppose it's still possible...
BTW, thanks a lot for your support and prompt replies!
Do you have a CRT plugged in when you boot up? If so, can you disconnect it and try again?
If that doesn't work, you may want to try the modesetting branch from the git repository of the xf86-video-intel driver, which does away with the Video BIOS for setting modes. It should certainly help in your case.
No CRT plugged in at bootup or at any time. May I similarly request an i386 binary driver for the modesetting branch? (What new feature confuses the BIOS that wasn't used in previous versions? Can't we allocate those buffers *after* the framebuffer instead of *before*?)
Created attachment 6570 [details]
X log

I am suffering from the same problem (I guess) since the Debian upgrade of the xorg xserver in unstable. I experienced the displaced framebuffer and have not been able to get it to work since. I compiled the latest git version of the xserver and the intel driver module (main branch and modesetting branch). The attached log shows the result when I try the modesetting branch (but it's the same for the main-branch intel driver). It seems that the driver does not detect any modes and aborts. I'd be very grateful for suggestions. Thanks!
Short update: the bug is still there with driver 1.6.5, no change since last time (i.e. 10 to 20 lines lost on top of the screen, but now it's aligned to a 1280-pixel line, so what remains of the screen is kind of usable). Once again, can't we just allocate those scratch areas (or whatever they are) after the framebuffer? That would make them invisible.
Like I said in comment #12 - try the modesetting branch.
(In reply to comment #14)
> Created an attachment (id=6570) [edit]
> X log
>
> I am suffering from the same problem (I guess) since the Debian upgrade of the
> xorg xserver in unstable. I experienced the displaced framebuffer and could not
> get it to work ever since.
>
> I compiled the latest git version of the xserver and the intel driver module
> (main branch and modesetting branch). The attached log shows the result when I
> try the modesetting branch (but it's the same for the main branch intel
> driver). It seems that the driver does not detect any modes and aborts. I'd be
> very grateful for suggestions. Thanks!

Try adding Option "NoDDC" and see if that helps.
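As with the MonitorLayout suggestion earlier, this option goes in the Device section of xorg.conf; a sketch (Identifier and Driver are placeholders, not from the reporter's config):

```
Section "Device"
    Identifier "Intel Graphics"
    Driver     "i810"
    Option     "NoDDC" "True"
EndSection
```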
Doesn't help much. It possibly reduces the size of the lost zone further, but I haven't measured. There still is a lost zone, anyway.
post a log.
and a screenshot of the problem would be useful too.
Created attachment 7170 [details]
Log with the Option "NoDDC" "True" commented out

Same difference.
Created attachment 7171 [details]
no visible difference
Created attachment 7172 [details]
requested picture

Captured with fbgrab during an attempt to run X with the kernel framebuffer enabled. The X framebuffer is shifted downwards, with the same contents at the top. The bottom of the picture seems hosed too, but it looks fine on screen.
Comment on attachment 7172 [details]
requested picture

Only the top of this picture is valid. It shows what the top of the screen looks like when I'm using version 1.6 of the driver or later (1.5 is even worse, since the shift is not even line-aligned). The bottom of the screen (below the 16 or so lost lines) is fine for me, but I could not capture it with fbgrab. The bug also happens when I run without the kernel framebuffer.
Those logs are not with the modesetting branch. Try again.
Darn, I had just grabbed the binary from debian. Where can I get binaries from the modesetting branch? Or a compilable tarball of the sources?
You'll have to check out the git repo. Look at the wiki at wiki.freedesktop.org; there's information there about getting the git repos. If you have any specific problems retrieving the repo, use the xorg mailing list for help.
Getting the source was easy enough: it was the example given in http://wiki.x.org/wiki/GitPage

git clone git://anongit.freedesktop.org/git/xorg/driver/xf86-video-intel

However, once there, I found that I couldn't get it working with either automake 1.4 or 1.9. With 1.4, I get an early error message about AM_CFLAGS. Upgrading to 1.9, I get up to a point where ./configure borks:

./configure: line 20685: syntax error near unexpected token `XINERAMA,'
./configure: line 20685: `XORG_DRIVER_CHECK_EXT(XINERAMA, xineramaproto)'

In case it matters, I'm using debian testing as my base system. What should I do from there to get the thing to compile? Pointers to proper Wiki pages welcome...
As in the previous comment. Any build issues like this should be discussed on the mailing lists.
Although I will say it just sounds like you need to install xineramaproto from the debian packages.
Also, once you've checked out the git repo, really make sure you are using the modesetting branch by running

git checkout modesetting

before doing any compilation.
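Putting the thread's steps together, a build sketch, assuming a Debian system with the usual autotools and X development packages already installed (the --prefix value and the use of autogen.sh are my assumptions, not stated in this thread):

```shell
# Fetch the driver and switch to the modesetting branch before building.
git clone git://anongit.freedesktop.org/git/xorg/driver/xf86-video-intel
cd xf86-video-intel
git checkout modesetting

# Regenerate the build system and compile; --prefix=/usr is assumed to
# match where Debian looks for /usr/lib/xorg/modules/drivers/.
./autogen.sh --prefix=/usr
make
sudo make install
```

If you had already built from another branch, do a full clean (make distclean) and rerun autogen.sh after switching branches, or stale objects may linger.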
OK, after installing a lot of other packages, I managed to compile things. git checkout modesetting was almost instantaneous, with no output; hope it worked. Which object file should I use? src/.libs/i810_drv.so? When I use it, I get the following error when the server tries to load it:

dlopen: /usr/lib/xorg/modules/drivers/i810_drv.so: undefined symbol: I830xf86DefaultModes

Am I doing something wrong? Do I first have to recompile the whole server or something like that? What exactly must I compile from which source, and what can I keep from my debian system?
Check that it has built all the .c files to .o, as it sounds like i830_xf86Modes.c isn't getting built.
Woohoo! The modesetting branch works perfectly for me, with or without option NoDDC. Congratulations! Does that mean I can now use lots of weird resolutions and let the hardware do the stretching for me? Or does the XV and/or GL2 support make this notion obsolete anyway? Anyway, the battle was hard, but you've made it worthwhile. Thanks a lot, and sorry for being so annoying. Question remaining: when will the branch become mainline and be released?

NB: the bug with I830xf86DefaultModes was due to "make clean" not being enough. I needed make distclean or something like it, and to regenerate the Makefiles.
Created attachment 7214 [details]
Binary driver as compiled from the modesetting branch on debian testing i386

For the sake of other testers too lazy to go through the whole process and trusting enough to download a binary from the internet, here's the driver I compiled. (Not that compiling unaudited source code requires any less trust. The real downside is that my binary won't automagically include fixes and improvements made to the source since I compiled it.) NB: I needed to do a real clean, delete config.guess, and rerun the whole autogen.sh, because I had compiled before running git checkout modesetting. PS: you might want to strip the .so for a slight space saving; I left it unstripped for debugging purposes. Kisses to Alan!
Created attachment 7215 [details]
X log

No change here: I built the latest driver from the modesetting branch, but I still get basically the same log. I disabled most extensions in my xorg.conf and added NoDDC, but that didn't help either. Any suggestions? Thanks!
As a downside to the modesetting driver, I find that (on my machine) it's no longer possible to switch back to text mode: whether with VGA text mode or the intelfb driver, the screen is garbled after switching to text mode. Happily, you can switch back to X.