Updated everything to the released individual tarballs; using the intel driver from git master, a kernel from git, KMS enabled.

On 1.6.3 (correct size):

(II) intel(0): Setting screen physical size to 303 x 190

On 1.7.0rc0 (wrong size):

(II) intel(0): Setting screen physical size to 381 x 238

Both show the same DPI at start:

(==) intel(0): DPI set to (96, 96)

The funny thing is that on 1.7 xrandr reports the proper size:

[arekm@t400 ~]$ xrandr|grep LV
LVDS1 connected 1440x900+0+0 (normal left inverted right x axis y axis) 303mm x 190mm

but xdpyinfo doesn't:

[arekm@t400 ~]$ xdpyinfo|grep dimens
  dimensions:    1440x900 pixels (381x238 millimeters)

Fonts are too small because X uses that wrong dimension. I switched back to 1.6 (changing just the X server and rebuilding the intel driver; all other xorg packages and the kernel stayed the same) and the issue did not occur, so this looks like an xserver 1.7 problem.

Using

[arekm@t400 ~]$ xrandr --dpi 120/LVDS

almost fixes the problem:

[arekm@t400 ~]$ xdpyinfo|grep dimens
  dimensions:    1440x900 pixels (304x190 millimeters)

X isn't started with the -dpi option:

root 8931 7.7 1.2 176900 48948 tty7 Ss+ 13:41 0:36 /usr/bin/X -nolisten tcp :0 vt7 -auth /var/run/xauth/A:0-WgaEim

A fix for the xserver would be welcome.
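For reference, the two reported physical sizes correspond exactly to the panel's real DPI and to a forced 96 DPI; a quick sketch of the arithmetic (shell, assuming standard awk):

    # 1440 px over 303 mm is the panel's real DPI; over 381 mm it works out to 96.
    awk 'BEGIN {
        printf "303 mm wide -> %.0f dpi (the real panel)\n", 1440 / (303 / 25.4)
        printf "381 mm wide -> %.0f dpi (the forced value)\n", 1440 / (381 / 25.4)
    }'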
Created attachment 29210 [details] X log from 1.6.3 xserver (working setup)
Created attachment 29211 [details] X log from 1.7.0 rc0 (not working properly setup)
Created attachment 29212 [details] xorg.conf used
Commit fff00df94d7ebd18a8e24537ec96073717375a3f: can you try reverting that from 1.7 and seeing if it helps?
Reverting that patch (plus a small build fix) makes my setup work again:

$ xdpyinfo|grep dimens
  dimensions:    1440x900 pixels (303x190 millimeters)

Now, am I supposed to configure this manually via xorg.conf, as the commit log says?
The 'screen size' as reported in the core protocol now respects the DPI value given by the user or config file and ignores the actual monitor DPI of any connected monitor. The real monitor size is reported through the RandR extension if you really need to know the physical size of the display.

This is intentional and follows the practice seen in many other desktop environments where the logical DPI of the screen is used as an application and font scaling factor.

If you don't like it, you're welcome to configure the X server to taste; we haven't removed any of those knobs. And, you can even use xrandr to change things on the fly.
"now respects the DPI value given by the user or config file" - ok but I don't have DPI set in config and don't have DPI set in command line. I would expect some sane/matching my LVDS value to be used by default in such case. Also xrandr --dpi 120/LVDS _almost_ works because it's wrong by 1mm. Will try to set size via config then.
> The 'screen size' as reported in the core protocol now respects the DPI value
> given by the user or config file and ignores the actual monitor DPI of any
> connected monitor.

Hmm, but what's the point? Over the past few years, my xorg.conf has almost completely disappeared, either because the configuration is done somewhere else, or because all the settings are correctly autodetected by X. Why revert that and make people specify the display size (which the user usually doesn't really know anyway, while the server does)?

> The real monitor size is reported through the RandR extension if you really
> need to know the physical size of the display.

So the user needs to use X to get his display size, then edit xorg.conf to tell X the display size. I find that a bit painful and inconsistent.

> This is intentional and follows the practice seen in many other desktop
> environments where the logical DPI of the screen is used as an application and
> font scaling factor.

Hmm, I don't get it. I'm not really comfortable with all that, but if my screen is a high-DPI one and my OS supports it, why should I do as other, crappier OSes do and stick with a low resolution, 96 DPI screen?

> If you don't like it, you're welcome to configure the X server to taste; we
> haven't removed any of those knobs. And, you can even use xrandr to change
> things on the fly.

That's good, but I really don't know why it's necessary. Wasn't autodetection working correctly for the majority of users? Couldn't the other people use the DisplaySize stuff?

Cheers,
*** Bug 25897 has been marked as a duplicate of this bug. ***
(In reply to comment #6)
> This is intentional and follows the practice seen in many other desktop
> environments where the logical DPI of the screen is used as an application and
> font scaling factor.

This is definitely a good idea, and it's what many X applications already did. For example, I can tell Firefox to lay out text in a column that is 16 cm wide, or I can tell urxvt to use a 9pt font (9pt is 1/8 of an inch).

However, these units are only meaningful if the DPI is correct. If X is configured to assume 96 DPI, an application which wants to display something 16 cm wide will translate that to 605 pixels. But when those pixels are then output to the attached 129 DPI display device, the result is only 12 cm wide.

> If you don't like it, you're welcome to configure the X server to taste; we
> haven't removed any of those knobs. And, you can even use xrandr to change
> things on the fly.

I don't mind having to configure the X server to get the desired behaviour. However, after perusing the man pages, I can't find the knob which restores the earlier behaviour of computing the DPI automatically from the geometry of the attached monitor. The best solution I've come up with so far (save for patching the server) is adding

xrandr --fbmm `xrandr | sed -n '/ connected / {s/.* \([0-9]\+\)mm x \([0-9]\+\)mm/\1x\2/p;q}'`

to my startup scripts. So yes, the actual size information is available, and yes, the resolution can be tuned on the fly, but why is pulling a meaningless DPI value out of thin air the default?
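For readability, the same workaround as a small script (a sketch; it assumes a single connected output and the xrandr output format shown above):

    #!/bin/sh
    # Feed the physical size RandR already knows back in as the framebuffer
    # size, so the core protocol reports it instead of assuming 96 DPI.
    size=$(xrandr | sed -n '/ connected /{s/.* \([0-9]\+\)mm x \([0-9]\+\)mm.*/\1x\2/p;q}')
    [ -n "$size" ] && xrandr --fbmm "$size"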
(In reply to comment #8)
> Hmm, but what's the point? Over the past few years, my xorg.conf has almost
> completely disappeared, either because the configuration is done somewhere
> else, or because all the settings are correctly autodetected by X. Why revert
> that and make people specify the display size (which the user usually doesn't
> really know anyway, while the server does)?
> [...]

+1

My current screen is a 13.3" 1440x900 LCD, which works out to about 127 DPI. I was like "WTF has happened???" when I upgraded to 1.7 and saw everything at 96 DPI.

Why ignore valid information we already have? Anyone who wants to override it perfectly well can. I don't see why everyone who wants the correct behavior should have to configure it, telling xorg "you have the information, use it"; it should be "the information you have is wrong, please use X instead", as with the old behavior...

Oh, and by the way, which "other desktop environments"? DPI is dots per inch; there can't be any other interpretation than "how many pixels/dots are used for one inch on screen".

Thanks
Evgeni
*** Bug 26194 has been marked as a duplicate of this bug. ***
(In reply to comment #6)
> The 'screen size' as reported in the core protocol now respects the DPI value
> given by the user or config file and ignores the actual monitor DPI of any
> connected monitor.

I've read the rest of the comments here and I'm still not sure I understand. What DPI value given by the user or config file? I haven't specified DPI *anywhere* (my xorg.conf file is empty except for 4 lines to set the video driver to 'nouveau' or 'nvidia'). And yet xdpyinfo reports 96 DPI, when the real resolution is 112 DPI.

This sounds incredibly inconsistent and not very useful. You're saying I'm forced to either override Xfce's detected DPI (which will only "unbreak" GTK apps, and nothing else), or add entries to my xorg.conf file to set the display size? That makes no logical sense to me. *Please* tell me I'm misunderstanding something here.

(I suspect I may be: when I use nouveau, I get an incorrect DPI from xdpyinfo, even though Xorg.0.log prints the correct display size and DPI. When I use nvidia, my DPI is correct everywhere.)
(In reply to comment #13)
> What DPI value given by the user or config file? I haven't specified DPI
> *anywhere*

Me too.
I do not see any voting possibility here, so I can only add: +1 / me too.
Same here: after upgrading from 1.6 to 1.7, all my GTK apps started drawing much smaller text. It hurts the eyes...
Created attachment 33081 [details] [review]
Add "DontLie" server flag, to encourage honesty.

In case anyone's interested, here's the patch I've been applying, which adds a so-called "knob" to restore the original behaviour. Maybe the flag can be extended to avoid other lies in the future.

    xfree86: Add DontLie server flag.

    Since commit fff00df94d7ebd18a8e24537ec96073717375a3f, RandR 1.2 drivers
    lie about the resolution of the attached screen by default. When the
    reported resolution is wrong, fonts and other UI elements that use
    physical units are not sized correctly.

    This patch adds a new server flag, DontLie, which encourages the server
    to be honest by default.

    Signed-off-by: Nick Bowler <nbowler@draconx.ca>
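Assuming the patch is applied, usage would presumably look something like this in xorg.conf (hypothetical; this option does not exist in any released server, only with the patch):

    Section "ServerFlags"
        # Honour the monitor's EDID-derived size instead of forcing 96 DPI
        Option "DontLie" "true"
    EndSection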
I'm sorry, but this really looks like an uncalled-for change to me.

> This is intentional and follows the practice seen in many other desktop
> environments where the logical DPI of the screen is used as an application and
> font scaling factor.

http://en.wikipedia.org/wiki/Font_size

A point is a unit of measure, 0.353 mm. If I set a character to be 10 points high, it has to be 3.5 mm high on the screen. I understand this may not be the most natural behaviour on projectors, but the vast majority of people use screens; there's no need to cause them trouble. (I don't even know whether projectors report DPI in their EDID anyway.) Unless explicitly overridden, X should respect the EDID.

Fiddling gratuitously with the DPI makes default configurations almost unusable on high resolution screens (fonts are rendered too small). It screws up 1:1 display of documents and images, and I suppose it messes with input devices like tablets and touchscreens. Even the default of 96 DPI doesn't make sense; that resolution is getting less and less common every day.

To further the annoyance, I haven't found a way to override the 96 DPI default in xorg.conf: DisplaySize gets ignored, and so does Option "DPI". Right now I'm stuck with xrandr --dpi in .xsessionrc, which is not what I'd call user friendly.

Please reconsider this change in behaviour. What bug was it supposed to fix?

Regards,
Luca
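For reference, the conversion behind the point-size argument is 1 pt = 1/72 in, so pixel size = pt / 72 x DPI; a quick sketch (shell) of what a forced 96 DPI does on a ~130 DPI panel:

    awk 'BEGIN {
        px = 10 / 72 * 96                          # pixels X allocates for 10pt at 96 dpi
        printf "10pt at 96 dpi = %.1f px\n", px
        printf "on a 130 dpi panel that is only %.1f mm, not 3.5 mm\n",
               px / 130 * 25.4
    }'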
(In reply to comment #18)
> http://en.wikipedia.org/wiki/Font_size
> A point is a unit of measure, 0.353 mm.
> If I set a character to be 10 points high, it has to be 3.5 mm high on
> the screen.

Just to clarify, setting the font size to 10pt defines the size of an "em", the exact meaning of which depends on the selected font (it might not exactly correspond to the height of a character on screen).

> I understand this may not be the most natural behaviour on
> projectors, but the vast majority of people use screens; there's no
> need to cause them trouble.

The main problem here is that our method of specifying font sizes is not well suited for devices such as projectors or TVs because it does not take the viewing distance into account. However, lying about the DPI doesn't actually improve the situation.

> Fiddling gratuitously with the DPI makes default configurations almost
> unusable on high resolution screens (fonts are rendered too small).

And when it doesn't make fonts unreadable, it makes them ugly.

> Even the default of 96 DPI doesn't make sense; that resolution is
> getting less and less common every day.

I have owned exactly one display device in my lifetime with this resolution: a 17" LCD monitor with 1280x1024 pixels. Most of my CRTs have higher resolution, and most of my other "external" LCDs have lower. My laptops have significantly higher resolution than all my other devices. So from my personal experience, 96 is almost always the wrong choice. The number seems to have come out of nowhere and makes little sense as a default.

> Please reconsider this change in behaviour.
> What bug was it supposed to fix?

The commit message says:

    Reporting the EDID values in the core means applications get
    inconsistent font sizes in the default configuration.

This makes no sense, since font sizes are consistent only when the DPI correctly reflects reality! This change *causes* font sizes to be inconsistent.
(In reply to comment #19)
> 96 is almost always the wrong choice. The number seems to have come out of
> nowhere and makes little sense as a default.

It was chosen in order to make display of web pages using Xorg more consistent with the way they get displayed on Windows, which by default assumes 96. http://blogs.msdn.com/fontblog/archive/2005/11/08/490490.aspx explains the origin of 96.
(In reply to comment #19)
> This makes no sense, since font sizes are consistent only when the DPI
> correctly reflects reality! This change *causes* font sizes to be
> inconsistent.

Actually, a "correct" DPI only theoretically causes consistency. As a practical matter, the differing number of device pixels required to generate a glyph of some particular physical size typically results in an apparent difference when compared to the same physical size at a different DPI. This is because, with the most commonly used fonts, each unique pixel size is a physically unique design, not a simple magnification or demagnification of a single design.

This is most commonly noticeable at sizes in the vicinity of 16px to 20px. At some point in this range, stem weight changes from 1.00px to 2.00px. Without any applied font smoothing, this is always quite clear. When various smoothing effects are applied, the difference is generally less obvious, but still produces an apparent inconsistency.
(In reply to comment #21)
> Actually, a "correct" DPI only theoretically causes consistency. As a
> practical matter, the differing number of device pixels required to
> generate a glyph of some particular physical size typically results in
> an apparent difference when compared to the same physical size at a
> different DPI.

With correct DPI, 9pt fonts are readable on all my display devices. I can take a ruler and measure the glyph size, and it is effectively the same. With an incorrect 96 DPI, 9pt fonts on my laptop are too small to be legible without a magnifying glass. On a display with non-square pixels (common on CRTs), the problems are even more pronounced. While there can obviously be rasterisation differences (the higher resolution display will look better), this is not merely a theoretical issue.

(In reply to comment #20)
> It was chosen in order to make display of web pages using Xorg more consistent
> with the way they get displayed on Windows, which by default assumes 96.
> http://blogs.msdn.com/fontblog/archive/2005/11/08/490490.aspx explains the
> origin of 96.

Firefox has the ability to render based on a fixed DPI (which indeed solves the problem with broken websites not rendering correctly); I'm not sure why we need the feature in X.org as well. Of course, enabling that feature makes it impossible to read any text on web pages, but I suppose that's only a _minor_ inconvenience...

Is the goal of X.org to be bug-for-bug compatible with Microsoft Windows?
(In reply to comment #22)
> While there can obviously be rasterisation differences (the higher
> resolution display will look better), this is not merely a theoretical
> issue.

I guess I could have done a better job of making my point. Basically, what I meant was that because of rasterization differences there is less practical consistency than is ideal, so practice in fact falls short of theory. Still, the results aren't all that different from ideal, and using the actual DPI is in practice much better than an inane assumption of 96 without regard to the actual value.

> Firefox has the ability to render based on a fixed DPI (which indeed
> solves the problem with broken websites not rendering correctly).

By default, FF uses the greater of the actual DPI as reported to it by the environment, or 96. As a practical matter this infrequently makes much difference on web sites, as sites styled in such DPI-dependent physical measurements as mm, in or pt aren't particularly common, and FF's defaults are set in px, which is not affected by DPI.

> I'm not sure why we need the feature in X.org as well.

IMO Xorg has no business making that inane assumption. Nevertheless, there is some rationale for it that results in some distros doing it, e.g. http://wiki.mandriva.com/en/2009.0_Notes#Font_size_and_physical_DPI
(In reply to comment #23)
> IMO Xorg has no business making that inane assumption. Nevertheless, there is
> some rationale for it that results in some distros doing it, e.g.
> http://wiki.mandriva.com/en/2009.0_Notes#Font_size_and_physical_DPI

While I cannot complain about the decisions of a distribution that I don't use (maybe Mandriva users appreciate this), I must address some of the points on that wiki page:

| ... bizarre results when the DPI detection system fails

Indeed, such as when it decides for some reason to set the DPI to 96 instead of 130+ like it should.

| no desktop environment's interface is yet fully resolution independent

Agreed: things like the cover art images in my music player are rendered at a fixed pixel size and are therefore somewhat small. Of course, lying about the DPI doesn't actually fix this, but it does make it impossible to correct the problem at the source (e.g. make my music player display a 3x3 centimetre image instead of a 120x120 pixel image).

| characters could be much larger than the interface elements they are
| supposed to match

I have never, ever seen this. It certainly isn't a problem in any of the GTK+, Qt, Open Motif or other applications that I use daily (including applications that don't use any toolkit at all).

| Similar problems often occur on websites

I concede this point. Of course, as hinted above, this could be solved by setting layout.css.dpi to 96 and disabling any minimum font size in the default Firefox configuration (I don't know if other browsers have similar parameters). On a high resolution display, the price of this configuration is that you only get to gawk at pretty layouts instead of actually reading any information. Personally, I find the "back" feature of my web browser a suitable means of navigating websites which do not work on my computer.

| ... as many users are accustomed to in Microsoft Windows and Apple OS X

A serious question: how are high resolution displays usable at all on these operating systems, considering how bad things look on a 135 DPI display when Xorg assumes it's a 96 DPI display? What do they do differently?

| ... can still adjust the DPI value in the KDE or GNOME Control Center,
| or simply increase the default font sizes.

There are two obvious problems with this solution:

1) With hundreds of applications, each with a different mechanism for setting its font sizes, one literally needs to edit dozens of config files to increase the default font sizes for all programs. In an ideal world, one would say "I like 9pt fonts for most text" and every program would use that, but this is sadly not the case today.

2) Even if you fix all the default font sizes, or adjust the DPI value in the KDE or GNOME Control Center, or with xrandr --fbmm for those who don't use KDE or GNOME, such that everything's perfect: you have to do it all over again when you change display devices. It also makes it impossible to share config files between computers (e.g. an NFS-mounted home directory).

I don't actually care if the *default* behaviour for Xorg is to use 96 DPI unconditionally. My gripe is that there is no (documented) way to restore the autodetection (without patching the server): a config option such as the one introduced by my patch solves this issue 100% for me. And while the xrandr --fbmm hack posted above (which parses the xrandr output to get the physical size) works without patching Xorg, it's just too absurd to have to do that, and it seems likely to break in the future.
(In reply to comment #24)
> I don't actually care if the *default* behaviour for Xorg is to use 96
> DPI unconditionally. My gripe is that there is no (documented) way to
> restore the autodetection (without patching the server): a config option
> such as the one introduced by my patch solves this issue 100% for me.

May I second that. I am afraid this discussion took the wrong track (while being quite interesting and exceptionally useful for me personally). It is not about whether 96 DPI is better than any other value. It is about leaving no way to configure the previous behaviour. That is the real bug.
(In reply to comment #20)
> It was chosen in order to make display of web pages using Xorg more consistent
> with the way they get displayed on Windows, which by default assumes 96.

Is this an official position of the X.org developers? Is it documented anywhere?
On Tue, Feb 16, 2010 at 10:58:07PM -0800, bugzilla-daemon@freedesktop.org wrote:
> --- Comment #26 from Andrey Rahmatullin <wrar@altlinux.org> 2010-02-16 22:58:04 PST ---
> (In reply to comment #20)
> > It was chosen in order to make display of web pages using Xorg more consistent
> > with the way they get displayed on Windows, which by default assumes 96.
>
> Is it an official position of X.org developers? Is it documented anywhere?

Sure, consider it an official position. I don't think it's unreasonable, especially if you assume that lower-DPI displays are likely to be higher-resolution and thus physically huge, meaning that people sit further away from them, and that displays with a meaningfully higher DPI are almost always found in phones (as well as some laptops) these days, meaning that people hold them a great deal closer to their face.

I do agree that being able to configure the reported DPI (or just Option "DontForce96DPI") would be entirely useful, but I can't see us changing anything in the near future, particularly if it breaks web page display. Saying 'well, don't go to that website then' isn't helpful to anyone at all, and makes us look like we value strict technical correctness ('but don't you know what the true definition of a point is?!?') over an actual working system. While we do value strict technical correctness, we don't value it to the point of crippling everything else.
(In reply to comment #27)
> I do agree that being able to configure the reported DPI (or just Option
> "DontForce96DPI") would be entirely useful,

So why not just add it (Option "DontForce96DPI") and finish this long thread for good?
(In reply to comment #27)
> Sure, consider it an official position. I don't think it's unreasonable,
> especially if you assume that lower-DPI displays are likely to be
> higher-resolution and thus physically huge, meaning that people sit further
> away from them, and that displays with a meaningfully higher DPI are almost
> always found in phones (as well as some laptops) these days, meaning that
> people hold them a great deal closer to their face.

As was mentioned earlier in this thread, web browsers such as Firefox do the Right Thing(tm) by default on lower resolution displays, and don't need any help from Xorg. Firefox can be configured to do the same thing for high resolution displays, still without any help from Xorg. I agree that extremely low resolution displays are very likely to be TVs or projectors.

> I do agree that being able to configure the reported DPI (or just Option
> "DontForce96DPI") would be entirely useful, but I can't see us changing
> anything in the near future, particularly if it breaks web page display.
> Saying 'well, don't go to that website then' isn't helpful to anyone at
> all, and makes us look like we value strict technical correctness ('but
> don't you know what the true definition of a point is?!?') over an
> actual working system. While we do value strict technical correctness,
> we don't value it to the point of crippling everything else.

When the DPI is falsely set to 96 on a high resolution laptop display, the result is *NOT* an "actual working system".
> --- Comment #28 from Andrey Borzenkov <arvidjaar@mail.ru> 2010-02-17 04:31:18 PST ---
> So why not just add it (Option "DontForce96DPI") and finish this long thread
> for good?

We're waiting for your patch.
(In reply to comment #30)
> We're waiting for your patch.

A patch was already posted in comment #17.
(In reply to comment #29)
> > While we do value strict technical correctness, we don't value it to the
> > point of crippling everything else.
>
> When the DPI is falsely set to 96 on a high resolution laptop display,
> the result is *NOT* an "actual working system".

Arguably it is "working", but "working" isn't _usable_ for many of us with less than perfect vision. cf. bug 26608
> Sure, consider it an official position. I don't think it's unreasonable,
> especially if you assume that lower-DPI displays are likely to be
> higher-resolution and thus physically huge, meaning that people sit further
> away from them, and that displays with a meaningfully higher DPI are almost
> always found in phones (as well as some laptops) these days, meaning that
> people hold them a great deal closer to their face.

The problem with a fixed DPI is not really with very large screens; you're right, people use them from farther away. Nor is it with phones; I sincerely doubt they use the default X configuration, so they don't really care about this. It is with laptops, which I would say the majority of new Linux users have: how many students have you seen at a university who mainly use a fixed PC? With a laptop, you can't really choose the distance at which you sit from the screen; you're bound by the keyboard. Still, laptops and netbooks have greatly varying screen resolutions, from 96 DPI to 150 and more. One size fits all is not going to work here.

> Saying 'well, don't go to that website then' isn't helpful to anyone at
> all, and makes us look like we value strict technical correctness ('but
> don't you know what the true definition of a point is?!?') over an
> actual working system. While we do value strict technical correctness,
> we don't value it to the point of crippling everything else.

When I recalled the definition of a point, I did it for a reason. Programs count on it to work correctly: to not show fonts too small to be easily read, and to make "100% zoom" even make sense. At resolutions over 130 DPI (not rare at all today on portable systems), default fonts get hard to read if you stick to 96 "logical" DPI. Now, that is not what I'd call an actual working system.

Moreover, if the problem resides in websites, then it must be addressed in browsers. I read that Firefox already does this; I don't know if it really does (on Debian it uses the system DPI), but then you should file a bug if you think it shouldn't. Breaking the browser's more logical behaviour to fix broken web design is a solution; breaking the whole of X is not...

The only real argument against using the real screen size is for huge screens, like TVs (for projectors the problem doesn't arise; they don't have an intrinsic screen size). I don't even know how many TVs report their real physical size in EDID; I tried only one, and it didn't. (As for large computer screens, the most widespread size I see around these days is about 24" 16:9 at 1920x1080 pixels, which makes a lucky ~96 DPI, so they don't get hurt either way.) Anyway, if you care about the many owners of huge LCDs, you could set a maximum size above which to switch to 96 DPI.

I ask you again to reconsider your decision.
I have a laptop with a 14" LCD at 1400x1050 (286mm x 214mm). The screen is very fine-grained, and my correct DPI is about 124x124. Now, with a forced 96x96 DPI, I have to use very small fonts (DejaVu Sans 7pt) to get reasonable output on this screen. I think there should be freedom in choosing one's own DPI. Thanks.

(II) intel(0): Output LVDS1 using initial mode 1400x1050
(II) intel(0): Using default gamma of (1.0, 1.0, 1.0) unless otherwise stated.
(**) intel(0): Display dimensions: (286, 214) mm
(**) intel(0): DPI set to (363, 486)
...
(WW) intel(0): Option "DPI" is not used
This is really fun on an XO-1, with its 200 or 267 DPI native resolution...
(In reply to comment #34)
> I have a laptop with a 14" LCD at 1400x1050 (286mm x 214mm). The screen is
> very fine-grained, and my correct DPI is about 124x124. Now, with a forced
> 96x96 DPI, I have to use very small fonts (DejaVu Sans 7pt) to get
> reasonable output on this screen.

If the X server resolution is set smaller than the actual one, then fonts will be smaller than expected, not larger.

> (**) intel(0): DPI set to (363, 486)

This looks to me like a different problem: your log shows the resolution being set to a phenomenally huge (and not even square) value.
(In reply to comment #36)
> > (**) intel(0): DPI set to (363, 486)
>
> This looks to me like a different problem: your log shows the resolution
> being set to a phenomenally huge (and not even square) value.

I think that line is nonsense. But how can I force the correct resolution without the DPI option in xorg.conf?
(In reply to comment #37)
> But how can I force the correct resolution without the DPI option in
> xorg.conf?

For non-xrandr-1.2 drivers, one can use the "DisplaySize [width] [height]" option in xorg.conf. For xrandr-1.2 drivers, one can use "xrandr --fbmm [width]x[height]" after starting the server, but before starting any other clients. In either case, [width] and [height] are given in millimetres.
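Concretely, using the 1440x900 panel from comment 0 as an example (the millimetre values are that panel's; substitute your own monitor's measurements):

    Section "Monitor"
        Identifier  "Panel"
        DisplaySize 303 190        # width and height in millimetres
    EndSection

and, for xrandr-1.2 drivers, after server start:

    xrandr --fbmm 303x190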
(In reply to comment #6)
> The 'screen size' as reported in the core protocol now respects the DPI value
> given by the user or config file and ignores the actual monitor DPI of any
> connected monitor.
>
> The real monitor size is reported through the RandR extension if you really
> need to know the physical size of the display.
>
> This is intentional and follows the practice seen in many other desktop
> environments where the logical DPI of the screen is used as an application and
> font scaling factor.

Yes, and the default font size should then be set to match the screen resolution, as is also the practice in many other desktop environments. Note that even on Windows, which completely ignores the actual screen DPI, system integrators do preset the font scaling factor to approximately match the actual DPI.

> If you don't like it, you're welcome to configure the X server to taste; we
> haven't removed any of those knobs. And, you can even use xrandr to change

You just set them intentionally to the wrong value. Thank you very much.
*** Bug 27660 has been marked as a duplicate of this bug. ***
My workaround right now is to run "xrandr --dpi 130" from a file I put in /etc/X11/xinit/xinitrc.d. I'd prefer not to have to do that. Thanks.
(In reply to comment #38)
> For non-xrandr-1.2 drivers, one can use the "DisplaySize [width] [height]"
> option in xorg.conf. For xrandr-1.2 drivers, one can use "xrandr --fbmm
> [width]x[height]" after starting the server, but before starting any other
> clients. In either case, [width] and [height] are given in millimetres.

This does not work for me. The DPI is stuck at 96x96 (I would like to use 135). I've tried to set it:

1) via DisplaySize in xorg.conf
2) via xrandr --dpi 135
3) via xrandr --fbmm

xdpyinfo always reports a 96x96 DPI value. My video driver is intel, X.Org X Server 1.7.6. Anything else I could try?
(In reply to comment #42)
> ...The DPI is stuck at 96x96 (I would like to use 135).
> [...]
> Anything else I could try?

It's possible your distro sets Xft.dpi to 96, which typically will override Xorg's setting in at least some apps. If 'xrdb -query | grep dpi' produces 96, you need to find out where that's getting set and disable it, or change it to 135. It's also possible your DTE (GNOME? KDE? other?) is forcing 96; you'll have to look into its settings to see.

xdpyinfo does not always report the DPI used by all apps; there are different ways an app can detect DPI. http://fm.no-ip.com/Auth/dpi-screen-window.html will report the DPI Firefox is using, which may or may not match what xdpyinfo reports, and likely won't if Xft.dpi is set and your screen has higher than typical resolution.

A completely helpful response may depend on your video chip model, distro/version, and video driver/version, in addition to your Xorg server version. The most appropriate place and/or time to run xrandr commands can vary due to distro-specific nuances in the X implementation. To make --fbmm work in openSUSE 11.3, I had to put it in /etc/X11/xinit/xinitrc in the section labeled "#Add your own lines here...".
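Collecting those checks, a minimal diagnostic sketch (assuming the standard X utilities are installed):

    #!/bin/sh
    xrdb -query | grep -i dpi                      # any Xft.dpi override loaded via xrdb
    xdpyinfo | grep -E 'dimensions|resolution'     # core-protocol screen size and DPI
    xrandr | grep ' connected '                    # physical size as RandR reports it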
(In reply to comment #43)
> It's possible your distro sets Xft.dpi to 96 ... If 'xrdb -query | grep dpi'
> produces 96, you need to find out where that's getting set and disable it, or
> change it to 135.

Thanks for your fast reply!

> It's also possible your DTE (GNOME? KDE? other?) is forcing 96; you'll have
> to look into its settings to see.

First of all, the distribution is Arch Linux. I'm using e16 (version 1.0.2); as far as I know there is no way to set the DPI in E (but I will check). The display manager is xdm. I've checked everything under /etc/X11/xdm/; no configuration file mentions any DPI setting.

> xdpyinfo does not always report the DPI used by all apps; there are different
> ways an app can detect DPI. http://fm.no-ip.com/Auth/dpi-screen-window.html
> will report the DPI Firefox is using ...

The website from your link also reports 96 DPI.

> A completely helpful response may depend on your video chip model,
> distro/version, and video driver/version, in addition to your Xorg server
> version. ...

OK.
distro: Arch Linux, version 2010.05
Xorg version: 1.7.6
video device: Xorg.0.log reports it as

(--) PCI:*(0:0:2:0) 8086:2a42:144d:c063 Intel Corporation Mobile 4 Series Chipset Integrated Graphics Controller rev 7, Mem @ 0xfa000000/4194304, 0xd0000000/268435456, I/O @ 0x00001800/8
(--) PCI: (0:0:2:1) 8086:2a43:144d:c063 Intel Corporation Mobile 4 Series Chipset Integrated Graphics Controller rev 7, Mem @ 0xfa400000/1048576

The vendor claims it to be an "Intel GMA 4500MHD".
video driver: xf86-video-intel 2.10.0
(In reply to comment #44)
> (In reply to comment #43)
> > It's possible your distro sets Xft.dpi to 96, which typically will override
> > Xorg's setting in at least some apps. If 'xrdb -query | grep dpi' produces
> > 96, you need to find out where that's getting set and disable it, or change
> > it to 135.
(In reply to comment #45)
> > > It's possible your distro sets Xft.dpi to 96 ... If 'xrdb -query | grep
> > > dpi' produces 96, you need to find out where that's getting set and
> > > disable it, or change it to 135.

Uh, sorry, I forgot to post the result... xrdb -query gives no output.

I've now set "Xft.dpi: 135" in /etc/X11/xdm/Xresources, with the result that the fonts and the mouse pointer in xdm are bigger, so it seems xdm now has the right DPI setting. But as soon as I log in to e16, the mouse pointer gets smaller again and xdpyinfo still reports the wrong DPI setting. Setting Xft.dpi in ~/.Xresources or /etc/X11/Xresources according to http://www.mozilla.org/unix/dpi.html has no effect; xrdb -query still gives no output.
(In reply to comment #46)
> xrdb -query gives no output.

That probably means something is unsetting Xft.dpi (making it null) before the e16 desktop populates.

(In reply to comment #42)
> Anything else I could try?

Grep through /etc for a startup script containing something akin to '-quiet -nolisten tcp vt5 -dpi 96 dpms'. If you find something, change '-dpi 96' to '-dpi 135', or delete it.

v1.7.6 is really too old to be discussing here. Bugzilla is not supposed to be a help forum. You have a distro and/or window manager specific problem, and should try their help forums and/or bug trackers:

http://www.archlinux.org/mailman/listinfo
http://bbs.archlinux.org/
http://wiki.archlinux.org/
http://www.enlightenment.org/p.php?p=support&l=en
irc://freenode/#e
FWIW, here is another +1 for an option to keep the correct DPI. I was just trying out nouveau and first thought this was a bug in the driver; the Debian package maintainer pointed me to this bug. I'm back to using nvidia for now: no need to jump through extra hoops to use and test nouveau when it sets the wrong DPI, and I do need nvidia from time to time anyway (for the VDPAU support; my CPU is not powerful enough for 1080p movies).

Many thanks to all the Xorg maintainers for their great work on such an important piece of software. I do hope they reconsider this decision, or at least include the patch proposed above.

Regards,
Andrei
+1 here for restoring the original behavior, or at least providing an option to disable this MS way of handling screen resolutions.
(In reply to comment #49)
> +1 here for restoring the original behavior, or at least providing an option
> to disable this MS way of handling screen resolutions.

In fact, this can no longer validly be called an "MS" way: if your Windows 7 install disk has drivers for your video card, it will use the correct DPI (rounded down to increments of 25%) automatically, by default! We used to have the same here with Xorg. We need an xorg.conf flag; for example, NVIDIA has "UseEDIDDpi".

Also, to note: I don't have any specific definition of "correct" behavior in multi-monitor, mixed-DPI environments. I'm guessing Windows 7 may set it to the higher of the two; err on the side of "too big", not "aaugh, I can't read anything!"
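For reference, the NVIDIA knob mentioned above would look something like this in xorg.conf (a sketch; the "UseEdidDpi" spelling is the one documented in the NVIDIA driver README, and recent driver versions reportedly default to it being enabled):

    Section "Device"
        Identifier "Videocard0"
        Driver     "nvidia"
        Option     "UseEdidDpi" "TRUE"   # derive DPI from the monitor's EDID
    EndSection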
I wasn't aware of the Win7 behaviour, so this is a new "x11 way" =)

As for multiple different monitors (like laptop + external monitor), I don't know what is better. Mac OS just uses the DPI of the primary one (if it uses it at all), and it sucks, because merely moving a window from one display to another changes its physical size...
(In reply to comment #51)
> As for multiple different monitors (like laptop + external monitor), I don't
> know what is better. Mac OS just uses the DPI of the primary one (if it uses
> it at all), and it sucks, because merely moving a window from one display to
> another changes its physical size...

In a non-Zaphod multi-head configuration, I don't think there's any "right" answer, especially when you consider that outputs can overlap. So you just need to pick something arbitrarily in this case. No matter what choice you make, there will be very weird consequences, such as what you describe above.
The correct DPI is needed for legibility when the same user uses the same font size settings on multiple monitors. When the user sets the base font size to be legible on one monitor and then goes to another which has a higher (true) DPI, he will get poorly legible fonts.

The correct DPI is needed for WYSIWYG, which is a holy grail of desktop publishing (DTP). X Windows applications increasingly aspire to be suitable for DTP.

Look to the future: monitors have not yet attained the highest useful DPI. This is only going to become a more pronounced deficiency as monitors increase in DPI. In fact, I suspect that monitor DPI has been somewhat held back by the lack of proper support for automatically setting DPI. If current common monitors were completely adequate, we would not care about antialiasing or font hinting so much. 1200 DPI or higher is found in printers, and I am inclined to think it is not completely a marketing gimmick; 96 DPI printers are practically nonexistent.

I used to make fun, to friends and co-workers, of how MS Windows could not manage to set the DPI correctly automatically, unlike my Xorg. No one attempted to rebut it. Instead of being "compatible with old versions of MS Windows", better to be "compatible with reality". Playing a hoax on programmers by default is a bad thing.
Is this bug fixed in a newer release? A fix would be restoring the original behaviour or providing an option to revert to it. I got a new monitor with 109 DPI yesterday and started to struggle with too-small font sizes. I was not aware of this bug because my old monitor had approximately 96 DPI.

In my opinion it is a wrong decision not to use the correct DPI of the device. This is the best choice one can make to get readable fonts. If one wants to account for the different viewing distance of projectors, then the distance should be added as an additional option to the projector or to the X server, such that an effective DPI can be calculated. Using a fixed value of 96 is plain wrong.

It is also a problem that each tool starts to mess with the DPI settings:
- X ignores the correct setting and uses 96 DPI
- Firefox has its own DPI emulation
- KDE has a setting for the DPI it should assume
- For Scribus and GIMP one has to set the correct DPI in the settings; relying on the server is no longer possible
- The LaTeX beamer package assumes that slides are 15cm x 10cm to account for the larger viewing distance with projectors

Instead of having a single central spot for DPI configuration, the user has to cope with increased complexity.

I also think that the website argument is not relevant. I used a notebook with 144 DPI for several years. This was before the discussed changes, and X used the correct DPI value. Some websites had problems with too-small graphical elements compared to the font, but I cannot remember this making a website completely unusable. It scares me to imagine how fonts would look at 96 DPI on such a device.

Finally, I could always show Windows users that with X11, WYSIWYG is possible out of the box, while on their system the printout does not match the shown contents. With the introduction of this bug, this is no longer possible.
Please use the correct dpi by default.
(In reply to comment #55)
> Please use the correct dpi by default.

If it doesn't, it's usually:

1 - your distro's fault, or
2 - your hardware's fault, or
3 - your use of unequal-DPI multiple displays, which is a complicated problem for X to deal with

X automagic configuration will fall back to 96 if there's a problem with your hardware supplying the info the automagic needs to work, while many distros force 96, in part because it makes web browsers behave like Windows on a not insignificant number of web sites, and in another part because DTEs are behind the development curve on adapting fonts, icons and initial window sizes to high DPI displays. Resolution independence is needed, but not available, either from most web sites or from most DTE components.

OTOH, as long as you're sticking to one display, you can force correct DPI via xrandr, xorg.conf.d/, xorg.conf, xinitrc, the proprietary nvidia driver and/or several other documented ways if bothered by inaccurate automagic or distro forcing. If your distro maker is forcing 96, complain to it. Just because xorg.conf is not used by default does not mean you cannot use it to enjoy accurate DPI.
(In reply to comment #56)
> X automagic configuration will fall back to 96 if there's a problem with your
> hardware supplying the info the automagic needs to work, [...]

Isn't this bug precisely about X.org *always ignoring* the hardware DPI value from the EDID? (As per comment #5, comment #6, and commit http://cgit.freedesktop.org/xorg/xserver/commit/?id=fff00df94d7ebd18a8e24537ec96073717375a3f .)
Created attachment 43316 [details]
Xorg.0.log from server 1.9.3 using 86x86 DPI on i845G

(In reply to comment #57)
> Isn't this bug precisely about X.org *always ignoring* the hardware DPI value
> from the EDID?

There may have been a time when that was the case, but this bug was filed by an Intel video user long before KMS was required by the Intel driver. Obviously from the attachment, made using no xorg.conf, xrandr, xorg.conf.d/ or anything else to force DPI, "always" is not the much more recent case. The display used is a 20" 4:3 1400x1050 Viewsonic, several years old.
(In reply to comment #58)
> There may have been a time when that was the case, but this bug was filed by
> an Intel video user long before KMS was required by the Intel driver.
> Obviously from the attachment ... "always" is not the much more recent case.

From your own log:

[ 35.477] (**) intel(0): Display dimensions: (410, 310) mm
[ 35.477] (**) intel(0): DPI set to (86, 86)
...
[ 35.914] (II) intel(0): Setting screen physical size to 370 x 277

Which means:

$ qalc '1400 / (370 mm to inch)'
1400 / (370 * millimeter) = approx. 96.108108 / in
$ qalc '1050 / (277 mm to inch)'
1050 / (277 * millimeter) = approx. 96.281588 / in

Wanna bet 'xdpyinfo | grep reso' on your machine shows

resolution:    96x96 dots per inch

? :)

Regards,
Andrei

P.S. I get the same "resizing" to 96 DPI with the latest Xorg + nouveau from Debian unstable, so this is not limited to intel.
Created attachment 43320 [details]
86 DPI screenshot

What do I win? :-)
(In reply to comment #60)
> Created an attachment (id=43320) [details]
> 86 DPI screenshot
>
> What do I win? :-)

AFAIK this issue affects RandR 1.2 capable drivers only, and unless I'm mistaken, MGA is not such a driver.
(In reply to comment #60)
> Created an attachment (id=43320) [details]
> 86 DPI screenshot
>
> What do I win? :-)

You posted a log with intel, but your screenshot says mga...

Regards,
Andrei
The bug is marked general, not Intel. Various comments spoke of at least one other chip prior to my MGA comment 58. Before your comment 58 reached my eyes, I had to put that Intel host aside to work on the MGA host, so I used that to demonstrate what I wrote in comment 58: that "always" is incorrect.

That said, I reread comment 6, which does indicate that manual intervention via config file or xrandr is normally required to achieve accurate DPI; "normal" here meaning ATI, Intel or NVidia, i.e. chips that have current xrandr-supported drivers.
(In reply to comment #63)
> The bug is marked general, not Intel. Various comments spoke of at least one
> other chip prior to my MGA comment 58. Before your comment 58 reached my
> eyes, I had to put that Intel host aside to work on the MGA host, so I used
> that to demonstrate what I wrote in comment 58: that "always" is incorrect.

Fair enough.

> That said, I reread comment 6, which does indicate that manual intervention
> via config file or xrandr is normally required to achieve accurate DPI;
> "normal" here meaning ATI, Intel or NVidia, i.e. chips that have current
> xrandr-supported drivers.

I don't want to push too much, but up until now all the feedback on this bug has read (at least to me) as:

- autodetection of DPI is deprecated
- 96 dpi ought to be enough for anybody :)

Does this mean you agree that under normal circumstances the DPI should be the one detected from the EDID, with 96 just a fall-back value in case autodetection fails? Because I don't believe anyone here disagrees that 96 should be used as a *fall-back*; we just don't want it *overriding* correctly autodetected values.

Thanks for reading,
Andrei
I guess this is an invitation to leave for Wayland or whatever. My current screen is 96 DPI (give or take a pixel), so I couldn't care less right now, but the moment I switch to a display with a different resolution I will have to put up with this Xorg insanity as well.

If Xorg is going to be bug-compatible with Windows, and only with Windows default settings, not with the Windows preinstalled by system integrators on all laptops with hi-res screens, we might as well run ReactOS. The user experience will, again, be terrible; much worse than Windows. You should also add some (*((int (*)())0))(); to make it feel more like badly configured Windows.

If you are really concerned that DPI alone is not doing the right thing, then you should add a mechanism to adjust it per display type, not just flat-force everything to some random bogus value. E.g., if you had a statistical study saying that people usually sit at a distance from an LVDS panel which is about 2/3 of the distance to a TMDS one, then *leave the detected DPI alone* and add a *magnification factor* of 0.66 which modifies the reported DPI to be 0.66 of the actual DPI. Everyone can easily set it to 1 if they want DTP-correct display, or tune it to a value that better suits their use of the system. This *also* handles displays that have non-square pixels correctly. You need a correct baseline to magnify from to get a magnification factor, not a brain damage factor. You could use the same magnification factor to set the resolution of something you think is a TV to a flat 96 DPI, and still preserve the option to easily revert to the actual reported DPI.
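In user space, the proposed factor could be illustrated like this (a hypothetical shell sketch: scale the DPI computed from the EDID size by a chosen factor instead of discarding it for a flat 96; the 0.66 value and single-output assumption are examples only):

    #!/bin/sh
    factor=0.66   # per-display-type scaling factor, as proposed above
    # physical width in mm of the first connected output, per EDID/RandR
    mm=$(xrandr | sed -n '/ connected /{s/.* \([0-9]\+\)mm x.*/\1/p;q}')
    # framebuffer width in pixels
    px=$(xdpyinfo | sed -n 's/.*dimensions: *\([0-9]\+\)x.*/\1/p')
    dpi=$(awk -v px="$px" -v mm="$mm" -v f="$factor" \
          'BEGIN { printf "%d", px / (mm / 25.4) * f }')
    xrandr --dpi "$dpi"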
Okay, I meant to post this a while ago and forgot, but the recent churn has reminded me to do so. I'll start with the tl;dr:

I don't think this bug matters anymore.

It's been over a year since this bug was opened, and during that time I've come to the conclusion that one should simply *always* specify DisplaySize in xorg.conf. This bug does not affect any setup that has a DisplaySize for each monitor: the only issue is DPI computation for monitors that have no DisplaySize. Let's look at the situation before and after the change:

With the old behaviour, computing the resolution from the EDID works only if your monitor doesn't lie about its size (which happens far more often than it should). Otherwise, you have to set DisplaySize.

With the new behaviour, things work fine as long as 96 DPI is correct for your monitor. Otherwise, you have to set DisplaySize.

So regardless of which way we pick, some people need to add DisplaySize directives to their xorg.conf. We could probably argue all day about which way is "better", but I recommend a much simpler solution: grab a ruler and put DisplaySize directives in your xorg.conf.

This bug was only a problem because the change broke working server configurations (this sort of issue seems to happen all the time with the X.org server). While breaking working setups is bad, I suspect that the vast majority of people who ran into problems because of this bug have long since updated their xorg.conf. At this point, changing it back would probably break just as many setups, for marginal utility.
Let me disagree: I keep using my laptop either on a 17" monitor, or on a 21" monitor, or on the internal LVDS.

In both the 17" and 21" monitor cases, I want 100% zoom to really mean 100% zoom, so I need the correct DPI.

Your reasoning leads to having to modify xorg.conf each time I switch, which in my case means twice per day...
> --- Comment #67 from Samuel Thibault <samuel.thibault@ens-lyon.org> 2011-02-13 14:27:50 PST ---
> In both the 17" and 21" monitor cases, I want 100% zoom to really mean 100%
> zoom, so I need the correct DPI.

No, you need whatever app you're using to get the info about screen size from somewhere it has a decent chance of being correct (RandR), instead of Display{Height,Width}MM.
(In reply to comment #67)
> Let me disagree: I keep using my laptop either on a 17" monitor, or on a 21"
> monitor, or on the internal LVDS.
>
> In both the 17" and 21" monitor cases, I want 100% zoom to really mean 100%
> zoom, so I need the correct DPI.

I don't understand what you mean by "100% zoom", but I understand the need for correct DPI.

> Your reasoning leads to having to modify xorg.conf each time I switch, which
> in my case means twice per day...

We can assign monitor sections per connector for randr-1.2 drivers (see the sketch below), so we can easily arrange for the LVDS to always be right, since you presumably never change which panel is attached to it. So in the case where external monitors are swapped regularly, I guess this is still a problem.

I still think that changing the default behaviour now is a bad idea, so let's get a server option to enable autodetection. I think someone posted a patch for that earlier...
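A sketch of that per-connector binding: the "Monitor-<outputname>" device option is documented in xorg.conf(5), and "LVDS1" here stands in for whatever name xrandr reports for your panel; the millimetre values are examples:

    Section "Monitor"
        Identifier  "InternalPanel"
        DisplaySize 303 190              # this panel's real millimetres
    EndSection

    Section "Device"
        Identifier "Card0"
        Driver     "intel"
        Option     "Monitor-LVDS1" "InternalPanel"
    EndSection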
Why should it have to care about the screen size? 100% zoom is supposed to mean "app DPI matches screen DPI"; it has nothing to do with the whole screen.
(In reply to comment #66)
> With the old behaviour, computing the resolution from the EDID works only if
> your monitor doesn't lie about its size (which happens far more often than
> it should). Otherwise, you have to set DisplaySize.

Yes: only when your screen is broken.

> With the new behaviour, things work fine as long as 96 DPI is correct for
> your monitor. Otherwise, you have to set DisplaySize.

Which is not the case about half of the time, and is becoming increasingly rare.

> So regardless of which way we pick, some people need to add DisplaySize
> directives to their xorg.conf. We could probably argue all day about which
> way is "better", but I recommend a much simpler solution: grab a ruler and
> put DisplaySize directives in your xorg.conf.

I don't want to measure my screen every time I install Xorg somewhere.

> This bug was only a problem because the change broke working server
> configurations (this sort of issue seems to happen all the time with the
> X.org server). While breaking working setups is bad, I suspect that the vast
> majority of people who ran into problems because of this bug have long since
> updated their xorg.conf. At this point, changing it back would probably break
> just as many setups, for marginal utility.

If the screen lies, then you are free to take it and beat some of the manufacturer's staff with it until they fix it. I suggest somebody with a 120+ DPI screen burn a bag of Xorg CDs and beat some Xorg developer with it until they fix the X server, since they don't seem to respond to logical reasoning.
(In reply to comment #50)
> We need an xorg.conf flag -- for example, NVIDIA has "UseEDIDDpi".

Definitely. Is it that hard to implement?
(In reply to comment #72)
> Definitely. Is it that hard to implement?

No. A patch was attached to this report a year ago.
Created attachment 45751 [details] 120 DPI full 1600x1200 desktop screenshot

This demonstrates the current state of elevated actual DTE DPI affairs, as of the release of openSUSE 11.4 about 5 weeks ago. Video chip is rv380. Driver is radeon. 120 DPI is achieved exclusively via DisplaySize in xorg.conf. Note that all apps are using the same font for the UI (urlbars, main menu), but that web browser viewports are of two schools:

1-Those forcing 96 DPI (WebKit in Google Chrome 7.0.517.44 and Epiphany 2.30.6, Opera 11.01)
2-Those using the DTE's DPI (Gecko pre-rv2.0 in Firefox 3.6.16, KHTML in Konqueror 4.6)

Note that Gecko rv2+ also forces 96 in the usual cases, but provides a workaround (mozmm) that allows pages to size to match the DTE DPI, which http://fm.no-ip.com/Auth/Font/font-vera.html in the screenshots does; this means all of Firefox 3.x, SeaMonkey 2.0.x, Firefox 4.x & SeaMonkey 2.1x will render it the same as long as the DTE DPI is 96 or above.
Created attachment 45875 [details] setting DisplaySize
(In reply to comment #74)
> Created an attachment (id=45751) [details]
> 120 DPI full 1600x1200 desktop screenshot
>
> This demonstrates the current state of elevated actual DTE DPI affairs, as of
> the release of openSUSE 11.4 about 5 weeks ago. Video chip is rv380. Driver is
> radeon. 120 DPI is achieved exclusively via DisplaySize in xorg.conf. Note that
> all apps are using the same font for the UI (urlbars, main menu), but that web
> browser viewports are of two schools:

Thanks, I can confirm this with nouveau. However, as one can see in my attachment, this has to be set for every monitor and then the correct Monitor section associated with the output. Kind of complicated if you often switch monitors... Wouldn't it make more sense to have a "UseEDIDDpi" option, as already suggested above (and already used by nvidia)?

Thanks,
Andrei
Retitling this bug, since it affects current X servers as well, though possibly only randr-1.2-capable drivers, as mentioned in comment 63 and the immediately preceding comments.

This needs fixing. Each time I set up a new system, I end up with incredibly tiny fonts, and I end up having to manually fix (and hardcode) the DPI to the correctly autodetected value.

The server already has a flag to override DPI with a constant value (namely -dpi). It doesn't need a flag to go back to autodetection, because it needs to do that by default. Nothing needs the X server to hardcode 96 DPI; browsers can already do that if they want to. (Chrome already does so, in a misguided attempt to force a constant ratio between "px" and "pt" across all websites. Firefox still seems to respect the system DPI by default, AFAICT.) Meanwhile, having the correct DPI makes fonts have reasonable sizes, and makes "100% zoom" in various applications (such as evince) match the physical dimensions of a page.
(In reply to comment #77)
> (Chrome already does so, in a misguided attempt to force a constant ratio
> between "px" and "pt" across all websites.

Chrome does because it's built on WebKit, where 96 as the only possibility is the traditional Macintosh way, which also became the Internet Explorer default behavior back in v7 or v8.

> Firefox still seems to respect the system DPI by default, AFAICT.)

Since 2010/08/18 in https://bugzilla.mozilla.org/show_bug.cgi?id=537890 it only does so in the UI, not in web page content, where it now copies the IE, Chrome and Safari insanity. You can see the impact by using it to view both http://fm.no-ip.com/Auth/dpi-screen-windowg.html (old edition that worked as expected before that "fix") and http://fm.no-ip.com/Auth/dpi-screen-window.html (modified version that can be accurate only in Geckos, Konq, and most old browsers). Its behavior can be affected by two hidden preferences, layout.css.dpi (an integer) and layout.css.devPixelsPerPx (a string parsed as a float); the former defaults to -1 and the latter to 1.0.
Take it from a typographer: http://www.alistapart.com/articles/realfontsontheweb

"We have a world of display devices that have standardized to report their exact resolution, the space it occupies, and thus the pixels per inch, a key to moving text typography forward."

Who wants to break the news to him? Welcome to the past, Mr. David Berlow.
(In reply to comment #27)
> On Tue, Feb 16, 2010 at 10:58:07PM -0800, bugzilla-daemon@freedesktop.org wrote:
> > --- Comment #26 from Andrey Rahmatullin <wrar@altlinux.org> 2010-02-16 22:58:04 PST ---
> > (In reply to comment #20)
> > > It was chosen in order to make display of web pages using Xorg more consistent
> > > with the way they get displayed on Windows, which by default assumes 96.
> >
> > Is it an official position of X.org developers? Is it documented anywhere?
>
> Sure, consider it an official position. I don't think it's
> unreasonable. Especially if you assume that lower-DPI displays are
> likely to be higher-resolution and thus physically huge, meaning that
> people sit further away from them, and that displays with a meaningfully
> higher DPI are almost always found in phones (as well as some laptops)
> these days, meaning that people hold them a great deal closer to their
> face.
>
> I do agree that being able to configure the reported DPI (or just Option
> "DontForce96DPI") would be entirely useful, but I can't see us changing
> anything in the near future, particularly if it breaks web page display.
> Saying 'well, don't go to that website then' isn't helpful to anyone at
> all, and makes us look like we value strict technical correctness ('but
> don't you know what the true definition of a point is?!?') over an
> actual working system. While we do value strict technical correctness,
> we don't value it to the point of crippling everything else.

Why are you trying to fix browser bugs in X? If you want to fake the DPI to 96, that should be a configuration in XRandR, or something, that people can easily disable; but it would be a workaround, not a fix. Setting a fixed DPI is completely and totally the wrong thing to do. Say my laptop's display has a DPI of 142, and then I plug in my display of 101 DPI: the fonts will look completely different, and I would be forced to change the font settings (if there are any). Do you seriously think that people changing their font settings each time they plug/unplug external displays is ideal?
Please add an option (command-line or config file) to make the X server stop lying about the physical DPI of the display device. I understand all the arguments for pegging DPI at 96, but *I* still want my X server to report the actual DPI as read from the EDID. Maybe add an -autodpi command-line option? Or support "-dpi auto" with 96 being the default.
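Until something like "-dpi auto" exists, the closest thing is doing the arithmetic by hand; a rough sketch using the LVDS values from this report:

  # 1440 px across a 303 mm panel: 1440 / (303 / 25.4) = 120.7, so round to 121
  startx -- -dpi 121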
I'm seeing this bug too with the nouveau driver. I've set DisplaySize to 474 296. At first it looks okay:

[ 45437.510] (**) NOUVEAU(0): Display dimensions: (474, 296) mm
[ 45437.510] (**) NOUVEAU(0): DPI set to (90, 90)

and then it sets it to 96 dpi...

[ 45437.530] (II) NOUVEAU(0): Setting screen physical size to 444 x 277

Environment: Gentoo x86_64, Linux 3.0.4, Xorg Server 1.11.0, nouveau DDX driver from git.
Created attachment 51062 [details] X log from 1.11.0
I'm seeing this bug too with the nouveau driver:

[  745.858] (**) NOUVEAU(0): Display dimensions: (330, 210) mm
[  745.858] (**) NOUVEAU(0): DPI set to (129, 127)
...
[  745.892] (II) NOUVEAU(0): Setting screen physical size to 444 x 277

I noticed this after switching from the proprietary nvidia driver, and this is a *huge* regression for me. X should *not* try to fix bugs in some programs by lying to all of them. Right now I'm using "xrandr --fbmm" from my .xinitrc, but this is just an ugly hack -- I'll probably go back to nvidia next time I have to plug my laptop into a projector.

Environment: Arch Linux x86_64, Linux 3.0.4, xorg-server 1.11.0.
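For anyone wanting to copy that hack, a minimal sketch of the .xinitrc workaround (bash syntax; assumes the panel shows up as LVDS1, as elsewhere in this report):

  # Read the physical size RandR detected and push it back into the core
  # screen size that the server overrode with its 96 dpi default.
  size=$(xrandr | awk '$1 == "LVDS1" && $2 == "connected" { print $(NF-2) "x" $NF }')
  xrandr --fbmm "${size//mm/}"   # strips the mm suffixes, e.g. 303x190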
Was it really fixed??? I can't find any related fix in git master.
PLEASE stop reopening or discussing this here. This is not the proper place for such discussion -- this is not a forum or mailing list but bug reporting software, and the xorg developers concluded this is NOT A BUG. If you really want some changes in the DPI area, then please first discuss it through the proper channels, like the xorg@ mailing list (http://lists.freedesktop.org/mailman/listinfo/xorg). Thank you.
I've created a feature request to add the patch from comment 17: https://bugs.freedesktop.org/show_bug.cgi?id=41115
However you look at it, this is a bug. Xorg changed from working behaviour to bogus, broken behaviour.
(In reply to comment #20)
> It was chosen in order to make display of web pages using Xorg more consistent
> with the way they get displayed on Windows, which by default assumes 96.

It's called bug compatibility, and there's a knob in a browser for that already.

--
1400x1050@122, 1600x1200+1920x1080@varied
Could someone post where exactly the discussion continues? It seems that this is still problematic. How do Windows/Mac solve this DPI thing?
(In reply to comment #91)
> How do Windows/Mac solve this DPI thing?

Windows has never matched display density to desktop pixel density, usually assuming 96 back at least to Win95. I don't think Mac ever has either. Display density auto-detection used to be a big advantage of X on Linux over both Mac & Windows.
> Windows has never matched display density to desktop pixel density, usually
> assuming 96 back at least to Win95.

Wrong. Windows 7 uses correct (though rounded) scaling automatically:
http://technet.microsoft.com/en-us/library/ff716252.aspx
(In reply to comment #93)
> Wrong. Windows 7 uses correct (though rounded) scaling automatically:
> http://technet.microsoft.com/en-us/library/ff716252.aspx

The rounding applied is too coarse to be considered "correct" or "matching". The mismatch can be reduced, but in most cases it does not resemble the virtually exact matching the xserver used to do automatically[1]. In the table at that Technet URL, the average applied DPI is 74% of the physical DPI, with variation between zero and -50.5%; no variation is above zero, so it always errs on the side of smaller rather than larger object sizes.

[1] https://bugs.freedesktop.org/show_bug.cgi?id=41115#c18
*** Bug 99369 has been marked as a duplicate of this bug. ***
Some more details: we have one X Screen, which consists of one or more monitor outputs. Via xrandr it is possible to query the size (in mm) and resolution of each monitor output, which also allows calculating the DPI of each output. But xdpyinfo prints information about the X Screen, not about individual monitor outputs. The X Screen dimensions in xdpyinfo are therefore those of the smallest rectangle which covers all monitor outputs.

And the whole reported bug is about the fact that "xdpyinfo | grep -A 2 ^screen" reports incorrect dimensions and DPI. But as the X Screen covers all connected monitors, it does not make sense to talk about DPI or dimensions here -- consider, for example, a configuration where two monitors connected to the X Screen have different DPIs. This is probably the reason why all the people in this bug report are confused that the X server reports an incorrect DPI.

I would say the real bug is that xdpyinfo provides dimensions for the whole X Screen (which does not make sense at all). Rather, xdpyinfo should show dimensions and DPI per monitor output via the XRANDR extension. Is it possible to fix the xdpyinfo application, so users would not be confused anymore?
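For illustration, a rough sketch of that per-output calculation straight from xrandr output (assumes GNU awk; the field layout varies a little between drivers and versions):

  xrandr | awk '$2 == "connected" && $NF ~ /mm$/ {
      # the geometry field looks like 1440x900+0+0; it is not always field 3
      for (i = 3; i <= NF; i++)
          if ($i ~ /^[0-9]+x[0-9]+\+/) { split($i, px, /[x+]/); break }
      w = $(NF-2); sub(/mm$/, "", w)            # physical width, "303mm" -> 303
      if (w > 0)
          printf "%s: %.0f dpi\n", $1, px[1] / (w / 25.4)
  }'

For the LVDS in this report that prints "LVDS1: 121 dpi".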
(In reply to Pali Rohár from comment #97)
> Is it possible to fix the xdpyinfo application?

It might be just the messenger -- I didn't look into the data structures even with grep, as it's not that important: those who really care about a consistent image on multi-screen displays will care about DPI consistency across those screens as well, and those who do not (or cannot) won't be "saved" by 96 dpi nailed down either. For one, I've got a system with 20" 1600x1200 and 22" 1920x1080 screens (both are reasonable-quality IPS monitors I don't intend to dump so far), and a laptop with a 13" 1920x1080 IPS panel and an occasional external monitor, so I do have some experience with this particular use case.
I sent a patch to the xorg-devel mailing list which should fix this problem in xdpyinfo and report correct dimensions and DPI information. It is here: https://lists.x.org/archives/xorg-devel/2017-April/053430.html
(In reply to Pali Rohár from comment #97)
> This is probably the reason why all the people in this bug report are
> confused that the X server reports an incorrect DPI.

This bug report is not at all restricted to multi-screen setups.
(In reply to Yves-Alexis from comment #100)
> This bug report is not at all restricted to multi-screen setups.

I just showed an example of why DPI information for the X Screen does not make sense. As written in the first post, xdpyinfo shows an incorrect DPI, but xrandr a correct one. And my patch linked in the previous post fixes this problem in xdpyinfo.
(In reply to Pali Rohár from comment #99)
> I sent a patch to the xorg-devel mailing list which should fix this problem
> in xdpyinfo and report correct dimensions and DPI information. It is here:
> https://lists.x.org/archives/xorg-devel/2017-April/053430.html

I can confirm that this patch works. However, there are problems when the physical size of an output cannot be detected:

screen #0:
  output: LVDS1
    dimensions: 1920x1080 pixels (340x190 millimeters)
    resolution: 143x144 dots per inch
  output: VIRTUAL6
    dimensions: 3440x1440 pixels (0x0 millimeters)
    resolution: -2147483648x-2147483648 dots per inch
Thanks for testing! The fix is here: https://lists.x.org/archives/xorg-devel/2017-May/053743.html

Now it should report 0x0 dots per inch instead of attempting a floating-point division by zero and reporting some (negative) infinity.
(In reply to Pali Rohár from comment #103)
> Thanks for testing! The fix is here:
> https://lists.x.org/archives/xorg-devel/2017-May/053743.html
>
> Now it should report 0x0 dots per inch instead of attempting a floating-point
> division by zero and reporting some (negative) infinity.

Looking better now, thanks :)
I'm just trying to follow the discussion on this "bug". Can someone explain how to manually change the DPI? I tried passing a "--dpi" option to startx, and I tried changing the DPI using "xrandr --dpi ...". In neither instance was I able to produce a visible effect in applications like Firefox or Gedit. It would be very useful if someone could point to a tutorial with an example that I can use as a starting point.

I tried this, and it only seemed to change the font of my window manager i3 (after restarting it): https://wiki.archlinux.org/index.php/xorg#Setting_DPI_manually
Frederick, highly likely you were foiled by the GTK toolkit used to build Firefox and Gedit. I've added a new subsection to that wiki URL to address that probability.
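For Frederick's concrete case, a minimal sketch (the 120 here is just an example value): xrandr --dpi only resizes the core X Screen, while GTK/Qt font rendering usually follows the Xft.dpi resource instead, so set both:

  xrandr --dpi 120
  echo "Xft.dpi: 120" | xrdb -merge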
@Frederick Eaton: I tried to document what the xrandr --dpi option is doing, to avoid confusion; see my patch: https://lists.x.org/archives/xorg-devel/2017-May/053796.html It does not set or change the DPI of a monitor output, but of the X Screen, which consists of all connected outputs.