Bug description: Since 2.10, I'm unable to use the Intel driver properly; the resolution is stuck at 800x600. I was previously able to use a 1920x1200 resolution on this configuration (I didn't believe it the first time!). I know this is a fairly old version of the Intel graphics chip, but I can't get newer hardware right now, and this system runs stably enough that I can recycle it for different tasks in the future. A developer on the IRC channel told me UMS support for this chip wasn't removed, but in practice it doesn't seem to work properly. Please tell me what information I need to submit to help track down and fix this bug.

System environment:
-- chipset: Intel 810e
-- system architecture: 32-bit
-- xf86-video-intel: 2.10
-- xserver: 1.7.5
-- mesa: 7.7
-- libdrm: 2.4.18
-- kernel: 2.6.32
-- Linux distribution: Arch Linux
-- Machine or mobo model: Dell OptiPlex GX110, Pentium III 733 MHz, 128 MB RAM
-- Display connector: VGA

Reproducing steps:

Additional info:
% xrandr
Screen 0: minimum 640 x 480, current 800 x 600, maximum 800 x 600
default connected 800x600+0+0 0mm x 0mm
   800x600        60.0*    56.0
   640x480        60.0
Created attachment 33512 [details] dmesg
Created attachment 33513 [details] Xorg log
% xrandr --verbose
Screen 0: minimum 640 x 480, current 800 x 600, maximum 800 x 600
default connected 800x600+0+0 (0x69) normal (normal) 0mm x 0mm
	Identifier: 0x68
	Timestamp:  889624
	Subpixel:   unknown
	Clones:
	CRTC:       0
	CRTCs:      0
	Transform:  1.000000 0.000000 0.000000
	            0.000000 1.000000 0.000000
	            0.000000 0.000000 1.000000
	           filter:
  800x600 (0x69)  28.8MHz *current
        h: width   800 start    0 end    0 total  800 skew    0 clock   36.0KHz
        v: height  600 start    0 end    0 total  600           clock   60.0Hz
  800x600 (0x6a)  26.9MHz
        h: width   800 start    0 end    0 total  800 skew    0 clock   33.6KHz
        v: height  600 start    0 end    0 total  600           clock   56.0Hz
  640x480 (0x6b)  18.4MHz
        h: width   640 start    0 end    0 total  640 skew    0 clock   28.8KHz
        v: height  480 start    0 end    0 total  480           clock   60.0Hz
Forgot to mention that it also fails under 2.9 and the current Xorg version.

% lspci | grep VGA
00:01.0 VGA compatible controller: Intel Corporation 82810E DC-133 (CGC) Chipset Graphics Controller (rev 03)
So what's the working version? Is this a kernel regression?
This issue is affecting a hardware component which is not being actively worked on anymore. Moving the assignee to the dri-devel list as contact, to give this issue a better coverage.
(==) intel(0): Will alloc AGP framebuffer: 8192 kByte
(==) intel(0): Using gamma correction (1.0, 1.0, 1.0)
(II) intel(0): Monitor0: Using default hsync range of 31.50-37.90 kHz
(II) intel(0): Monitor0: Using default vrefresh range of 50.00-70.00 Hz

8 MiB: 1920x1080x4 just fits (with a few hundred bytes to spare), but 1920x1200 does not. So you will need to increase the allocation with Option "VideoRam" "16384". The second problem is that your monitor does not offer a 1920x1200 mode, so you will need to add that modeline to your xorg.conf (or check whether a more recent xserver adds it back). And since that mode falls outside the default monitor settings, you will also need to specify the hsync/vrefresh ranges above. All in all, I think the issue is that the sanity checks became slightly more paranoid, causing the default framebuffer allocation to be reduced. I don't think this is an outright bug per se, and the only real fix would be to move i810 to KMS/GEM so that we can actually allocate and manage memory dynamically.
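Put together, the changes suggested above might look like the following xorg.conf fragment. This is only a sketch: the "Intel810" and "Screen0" identifiers are hypothetical, the modeline is a generic CVT timing for 1920x1200@60 (as produced by `cvt 1920 1200 60`), and the sync ranges are assumptions that must be checked against the actual monitor's specifications before use.

```
Section "Device"
    Identifier "Intel810"           # hypothetical identifier
    Driver     "intel"
    Option     "VideoRam" "16384"   # enlarge the framebuffer allocation to 16 MiB
EndSection

Section "Monitor"
    Identifier "Monitor0"
    # Widened sync ranges so the 1920x1200 mode is not rejected as out of range;
    # verify these against the monitor's documented limits.
    HorizSync   31.5 - 75.0
    VertRefresh 50.0 - 70.0
    # CVT modeline for 1920x1200 at 60 Hz (from `cvt 1920 1200 60`)
    Modeline "1920x1200_60.00"  193.25  1920 2056 2256 2592  1200 1203 1209 1245 -hsync +vsync
EndSection

Section "Screen"
    Identifier "Screen0"            # hypothetical identifier
    Device     "Intel810"
    Monitor    "Monitor0"
    SubSection "Display"
        Modes "1920x1200_60.00" "800x600"
    EndSubSection
EndSection
```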