Bug 16033 - [Randr12] DPI calculation ludicrously incorrect
Status: RESOLVED MOVED
Alias: None
Product: xorg
Classification: Unclassified
Component: Server/General
Version: git
Hardware: x86-64 (AMD64) Linux (All)
Importance: medium normal
Assignee: Xorg Project Team
QA Contact: Xorg Project Team
 
Reported: 2008-05-20 02:32 UTC by Christopher James Halse Rogers
Modified: 2018-12-13 22:20 UTC
3 users in CC



Attachments
Xorg log, showing DPI set to (230, 254) (145.89 KB, text/x-log)
2008-05-20 02:32 UTC, Christopher James Halse Rogers

Description Christopher James Halse Rogers 2008-05-20 02:32:01 UTC
Created attachment 16646 [details]
Xorg log, showing DPI set to (230, 254)

It seems that the DPI calculation is using the Virtual line to calculate the DPI. Since this is substantially larger than my laptop's screen resolution, the result is a DPI value slightly under double the true DPI!

This may well not be a nouveau bug; I've had a grep through the source and it seems that nouveau is just setting the DPI received from the server.
Comment 1 Frans Pop 2008-08-15 14:48:40 UTC
I'm seeing the same issue on my new laptop (HP 1510p running Debian Lenny). Initially DPI was getting set to (96, 96), which is too low for my laptop's LCD screen, apparently because the laptop's display does not provide size information.

From the Xorg log:
(II) intel(0): Output VGA disconnected
(II) intel(0): Output LVDS connected
(II) intel(0): Output TV disconnected
(II) intel(0): Output LVDS using initial mode 1280x800
(II) intel(0): Monitoring connected displays enabled
(II) intel(0): detected 512 kB GTT.
(II) intel(0): detected 7676 kB stolen memory.
(==) intel(0): video overlay key set to 0x101fe
(==) intel(0): Will not try to enable page flipping
(==) intel(0): Triple buffering disabled
(==) intel(0): Intel XvMC decoder disabled
(==) intel(0): Using gamma correction (1.0, 1.0, 1.0)
(==) intel(0): DPI set to (96, 96)


I had already been playing with a second monitor connected for which I needed the "Virtual" option. Using info from the Debian wiki I added the correct display size setting for LVDS, after which my xorg.conf had:

Section "Device"
        Identifier      "Intel GM965 Video Controller"
        Option          "Monitor-LVDS"  "Internal Panel"
        Option          "Monitor-VGA"   "External VGA Monitor"
EndSection

Section "Monitor"
        Identifier      "Internal Panel"
        DisplaySize     261 163         # 10.2x6.3"
EndSection

Section "Monitor"
        Identifier      "External VGA Monitor"
EndSection

Section "Screen"
        Identifier      "Default Screen"
        Monitor         "Internal Panel"

        SubSection "Display"
                Virtual         3000 2000
        EndSubSection
EndSection

After restarting the X server this resulted in:
(II) intel(0): Output VGA disconnected
(II) intel(0): Output LVDS connected
(II) intel(0): Output TV disconnected
(II) intel(0): Output LVDS using initial mode 1280x800
(II) intel(0): Monitoring connected displays enabled
(II) intel(0): detected 512 kB GTT.
(II) intel(0): detected 7676 kB stolen memory.
(==) intel(0): video overlay key set to 0x101fe
(==) intel(0): Will not try to enable page flipping
(==) intel(0): Triple buffering disabled
(==) intel(0): Intel XvMC decoder disabled
(==) intel(0): Using gamma correction (1.0, 1.0, 1.0)
(**) intel(0): Display dimensions: (261, 163) mm
(**) intel(0): DPI set to (291, 311)

Given that only the LVDS is connected the DPI should be (125, 126) based on the LVDS mode and the display dimensions. The actual DPI is way too high and can be shown to be calculated from the (dummy) Virtual mode (for example: 3000 / (261 / 10 / 2.54) = 291).
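[Editor's note] The arithmetic above can be sketched as follows. This is only an illustration of the standard pixels-per-inch conversion (DPI = pixels / (mm / 25.4)) applied to the values quoted in this comment; the X server's exact integer rounding may differ by one from these floats.

```python
# Illustration of the DPI arithmetic described above: DPI = pixels / (mm / 25.4).
MM_PER_INCH = 25.4

def dpi(pixels, mm):
    """Dots per inch for a span of `pixels` pixels over `mm` millimetres."""
    return pixels / (mm / MM_PER_INCH)

# LVDS panel: 1280x800 mode on a 261x163 mm display -> roughly (125, 125).
correct_x = dpi(1280, 261)   # ~124.6
correct_y = dpi(800, 163)    # ~124.7

# Buggy calculation: the 3000x2000 Virtual size over the same 261x163 mm,
# matching the (291, 311) reported in the log (modulo integer rounding).
wrong_x = dpi(3000, 261)     # ~291.9
wrong_y = dpi(2000, 163)     # ~311.7
```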

Even if multiple displays are connected the Virtual setting is not a useful value as it only defines the _maximum_ area that can be used by combining monitors and not the _actual_ area.

Note that if I also add the external VGA monitor in the Screen section, the DPI will default to (96, 96) again. But if I list them in reverse order (external VGA first and LVDS second), I get the above values again.
Seems like the display dimensions of the last monitor are used, regardless of what is actually connected.

Cheers,
FJP
Comment 2 Pekka Paalanen 2009-09-23 05:12:14 UTC
Does this problem still occur with the latest released xorg-server?

If it does, maybe this bug should be re-targeted to xorg-server?
Comment 3 Elvis Pranskevichus 2010-01-11 11:16:08 UTC
I can confirm this happening with xorg-server-1.7.4 on my system. I use an external monitor in a RandR setup, and the physical dimensions reported by xdpyinfo are totally wrong:

dimensions: 4000x1600 pixels (331x207 millimeters)
resolution: 307x196 dots per inch

I have DisplaySize set for both monitors and apparently the server takes only the first one.

Now, I've hit this trying to remedy a regression from xorg-server-1.6.  Previously, physical dimensions were detected correctly and a correct DPI was set.  Now (with 1.7.4) it just uses 96x96.  xrandr reports correct dimensions in all cases.

Oh, and I'm on xf86-video-intel, so this is almost definitely not nouveau-specific.
Comment 4 Marcin Slusarz 2010-07-23 16:37:32 UTC
Probably not a nouveau bug; changing component to Server/General.
Comment 5 Steve Graham 2014-11-01 18:35:54 UTC
This bug is still present. My X server (on an Intel Corporation Mobile GM965/GL960) reports:
X.Org X Server 1.15.99.904 (1.16.0 RC 4)
Release Date: 2014-07-07

What seems to be happening on my system is that the pixel dimensions are taken from the Virtual statement in the "Screen" section of xorg.conf but the millimetre dimensions are taken from the DisplaySize line of the LAST "Monitor" section.

In my case, only one application is affected: me-tv. When I checked its source, it calculates the pixel aspect ratio using the macros DisplayWidth, DisplayWidthMM, DisplayHeight and DisplayHeightMM. These return the values from xorg.conf as mentioned above, making the app compute a non-square pixel shape and show TV images stretched. (xdpyinfo returns the wrong DPI too.)
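[Editor's note] A minimal sketch of how a client derives the pixel aspect ratio from the four values those Xlib macros return. The function below is illustrative, not me-tv's actual code; the "mismatched" numbers are the 4000x1600 pixels / 331x207 mm pair that xdpyinfo reported in comment 3.

```python
# Sketch of a pixel-aspect-ratio computation from the values returned by
# the Xlib macros DisplayWidth/DisplayWidthMM/DisplayHeight/DisplayHeightMM.
# Plain numbers stand in for the real Xlib calls.

def pixel_aspect_ratio(width_px, width_mm, height_px, height_mm):
    """Ratio of a pixel's physical width to its physical height (1.0 = square)."""
    pixel_w = width_mm / width_px
    pixel_h = height_mm / height_px
    return pixel_w / pixel_h

# Consistent values (mode and size from the same panel) give ~square pixels:
square = pixel_aspect_ratio(1280, 261, 800, 163)       # ~1.0

# The mismatched values from comment 3 (Virtual-sized pixel dimensions,
# single-monitor millimetres) give badly non-square pixels, so images stretch:
stretched = pixel_aspect_ratio(4000, 331, 1600, 207)   # ~0.64
```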

As a workaround, I've simply entered fake values for DisplaySize for the second monitor, and this makes me-tv work correctly and xdpyinfo return a correct DPI. No other application seems to be affected by the workaround.
Comment 6 GitLab Migration User 2018-12-13 22:20:07 UTC
-- GitLab Migration Automatic Message --

This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity.

You can subscribe and participate further through the new bug through this link to our GitLab instance: https://gitlab.freedesktop.org/xorg/xserver/issues/371.

