Created attachment 38317 [details]
Xorg.0.log

I have two LCD monitors attached to this machine (with a G45): the first has 1920x1080 pixels and the second has 1440x900 pixels. However, the second is being configured, by default, to 1600x1200, which obviously looks _awful_. The driver sets both display modes correctly in the console; only X has this problem.

If I set PreferredMode on my second display to 1440x900 in xorg.conf, the default mode for the second monitor becomes correct, but now the first is set wrong (1280x1024). If I add PreferredMode to both displays, it works correctly.

Here's the xrandr output, which shows the correct size at the top of each mode list:

Screen 0: minimum 320 x 200, current 3520 x 1200, maximum 8192 x 8192
HDMI1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 510mm x 290mm
   1920x1080      59.9*+   60.0     60.0
   1680x1050      60.0
   1280x1024      75.0     60.0
   1152x864       75.0
   1024x768       75.1     60.0
   832x624        74.6
   800x600        75.0     60.3     56.2
   640x480        75.0     60.0
   720x400        70.1
VGA1 connected 1600x1200+1920+0 (normal left inverted right x axis y axis) 408mm x 255mm
   1440x900       59.9 +
   1600x1200      60.0*
   1400x1050      60.0
   1280x1024      75.0     60.0
   1280x960       60.0
   1152x864       75.0     60.0
   1280x720       60.0
   1024x768       75.1     70.1     60.0
   832x624        74.6
   800x600        72.2     75.0     60.3     56.2
   640x480        72.8     75.0     66.7     60.0
   720x400        70.1
DP1 disconnected (normal left inverted right x axis y axis)

I'm running latest git everything.
It's insane, but the EDID does include that mode amongst its supported timings. Prior to 1.9 it would have been rejected, as its pixel clock of 162.5MHz is larger than the stated max pixel clock of 160MHz, but with 1.9 we allowed a 5MHz fuzz (since the smallest significant value in the EDID's max-clock field is 10MHz, the fuzz lets this sort of rounding error pass). I'm not sure what the best approach to dealing with an incorrect EDID like this is.

But the console doesn't use that mode? Even weirder...

Nick, can you grab a drm.debug=6 dmesg so we can see what modes the kernel is choosing from?
(In reply to comment #1)
> It's insane but the EDID does include that mode amongst its supported
> timings. Prior to 1.9 it would have been rejected as its pixel clock of
> 162.5MHz is larger than the stated max pixel clock of 160MHz, but with 1.9
> we allowed a 5MHz fuzz (as the smallest significant value in the EDID is
> 10MHz to allow this sort of rounding error to pass).
>
> I'm not sure what the best approach to deal with an incorrect EDID like
> this is.

Is it really that incorrect? It's weird that 1600x1200 is listed, but aside from being very ugly due to squashing, the display works. The xrandr output seems well aware that 1440x900 is the preferred mode, judging by the + beside it.

> But the console doesn't use that mode? Even weirder...

The console _does_ use 1440x900 (the LCD's native mode), which is correct. Or are you surprised about the difference?

> Nick, can you grab a drm.debug=6 dmesg so we can see what modes the kernel
> is choosing from?

I can't check right at this moment, but I'll take a look in a couple of hours.
It does surprise me, because the mode-choosing algorithm is superficially meant to be the same, so that there is no mode switch when starting X. And the change I made to X's parsing of the EDID was to bring it into line with the kernel, so it too should be including 1600x1200 when deciding which mode to boot into.

<rant>No LCD panel should advertise anything but its native resolution.</rant> ;-)
In this case, it looks like we should be using the Preferred mode for the secondary display and not its largest.
-- GitLab Migration Automatic Message --

This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity. You can subscribe and participate further in the new bug via this link to our GitLab instance: https://gitlab.freedesktop.org/xorg/xserver/issues/218.