This bug used to affect both xorg and Wayland; it's been fixed in Wayland <https://bugs.freedesktop.org/show_bug.cgi?id=99383> but not xorg. Well, not officially, but I found your patch for Fedora <https://bugzilla.redhat.com/show_bug.cgi?id=1413306#c54> and applied it to Arch, and it works well.
I added a second stanza to the example xorg config though, using MatchIsPointer instead of MatchIsTouchpad (oddly, MatchIsPointer doesn't seem to match my touchpad), because I also wanted my mouse speed to be consistent between X and Wayland.
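For anyone wanting the same setup, this is roughly what the two stanzas look like. The DPIScaleFactor option name is the one from the Fedora patch and does not exist in the upstream driver, so treat this as illustrative only:

```
Section "InputClass"
        Identifier "hidpi touchpad scaling"
        MatchIsTouchpad "on"
        MatchDriver "libinput"
        Option "DPIScaleFactor" "2"
EndSection

Section "InputClass"
        Identifier "hidpi mouse scaling"
        MatchIsPointer "on"
        MatchDriver "libinput"
        Option "DPIScaleFactor" "2"
EndSection
```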
I see you're concerned that it might make some pixels unaddressable. Did you find out how the Wayland developers addressed that concern? My opinion is that it doesn't really matter, because the point of hidpi is to provide more pixels than a user can see individually, providing smoother text etc rather than more desktop space.
*** Bug 106737 has been marked as a duplicate of this bug. ***
The basic problem is: there is no strict definition of hidpi, the closest thing you get is the scale factor in the desktop environment. The libinput driver/libinput don't know anything about pixels. The X server does but it doesn't know whether 4000 pixels means "twice the density" or just "twice the pixels".
Example: if you have two 1920-pixel-wide monitors side by side, the roughly 4k-wide visible area doesn't mean you want to scale up by factor 2; it's just twice the desktop space, not twice the density. (Let's ignore that X would know it's two screens for this argument ;)
Our problem is "too many parts":
* libinput is in charge of pointer acceleration, but assumes 'normal' pixel density and leaves the rest up to the caller
* xf86-input-libinput just passes bits on
* the X server only sees pixels but doesn't know whether the desktop environment treats it as scaled hi-dpi or just really small pixels
* the desktop environment (usually) knows that but cannot tell anyone that this is the case.
On Wayland, the compositor knows about all this and gets the data from libinput. It can then multiply with the right factor and things work out magically.
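Schematically, the compositor-side fix amounts to multiplying libinput's motion by the scale of the output the pointer is on. This is a toy sketch, not actual compositor code; all names are illustrative:

```python
def scale_motion(dx, dy, output_scale):
    """Multiply raw libinput motion by the per-output scale factor.

    Very schematically what a Wayland compositor does: libinput reports
    motion assuming normal pixel density, and the compositor applies the
    scale factor of the output the pointer currently occupies.
    """
    return dx * output_scale, dy * output_scale

# On a 2x hidpi output, 10 units of raw motion cover 20 physical pixels.
print(scale_motion(10.0, 4.0, 2))
```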
Now, you say: why not just set DPIScaleFactor from the DE? Then I say, good point, but what are you going to do for the mixed-dpi case, given that xf86-input-libinput doesn't know which screen it's on? And that's just the first difficult corner-case :)
(In reply to Peter Hutterer from comment #2)
[per-monitor dpi scaling]
> On Wayland, the compositor knows about all this and gets the data from
> libinput. It can then multiply with the right factor and things work out
OK, that explains why it was easier to fix this in Wayland than X.
> Now, you say: why not just set DPIScaleFactor from the DE? Then I say, good
> point, but what are you going to do for the mixed-dpi case, given that
> xf86-input-libinput doesn't know which screen it's on? And that's just the
> first difficult corner-case :)
My understanding was that X doesn't really support different scale factors on different monitors anyway, so I thought in this case GNOME just applies the scale globally. Or does it have a workaround which works for rendering, but not for the pointer?
Hopefully I won't need X for much longer. Patched Chromium builds that support VAAPI are available, and I think the feature will land in official releases soon. This is really useful, but it only works under X, not XWayland, because the latter only supports DRI3 while VAAPI still requires DRI2. That is being worked on too.
> My understanding was that X doesn't really support different scale factors on
> different monitors anyway, so I thought in this case GNOME just applies the
> scale globally. Or does it have a workaround which works for rendering, but not
> for the pointer?
GNOME cannot apply any pointer motion scale in the X architecture, it can
only render its windows differently. And yes, the mixed dpi case in X is
still an unsolved problem. AFAIK the server doesn't change the
pointer speed right now either so you need to find the middle ground
between "too fast on low-dpi" and "too slow on high-dpi" through normal
pointer speed configuration. The problem is simply that the server doesn't
know whether gnome renders everything twice as big so it cannot know to
scale the pointer up by factor 2.
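That middle-ground tuning uses the ordinary AccelSpeed option, which is a standard xf86-input-libinput option with a range of -1 (slowest) to 1 (fastest); the value here is an arbitrary example, not a recommendation:

```
Section "InputClass"
        Identifier "compromise pointer speed for mixed dpi"
        MatchIsPointer "on"
        MatchDriver "libinput"
        # AccelSpeed ranges from -1 to 1; 0 is the default.
        Option "AccelSpeed" "0.4"
EndSection
```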
I think your Fedora patch is a good solution, and shouldn't cause problems in terms of forcing unwanted behaviour/appearance on users without hidpi monitors. So couldn't it be adopted upstream? Then would it be possible for GNOME/KDE to apply the setting on the fly instead of users having to edit an xorg.conf file? But I don't know if that's possible with X's config architecture/API.
The setting would have to be exposed as a device property for anything on-the-fly to work, but otherwise it'd be technically possible. In that case it should be moved into the server itself, because it shouldn't be driver-dependent.
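For context, options that are already exposed as device properties can be changed at runtime with xinput, and a scale-factor property would presumably work the same way. "libinput Accel Speed" is a real property; the device name below is only an example, and no scale-factor property exists upstream:

```shell
# List devices to find the right name or id.
xinput list --short

# Change an existing libinput device property at runtime.
xinput set-prop "SynPS/2 Synaptics TouchPad" "libinput Accel Speed" 0.5
```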
But it's still a hack, I'd need someone to really spend time on this and investigate whether it's possible at all to fix this issue in X and, if so, how.
Blu-Tack development is fun but causes nightmares for maintainers down the road.
Your suggestion would integrate the hack more tightly: once gnome/kde start using it we cannot realistically get rid of the option anymore. So if anyone figures out the correct solution, we'd then also have to figure out a way to undo this hack where configured.
-- GitLab Migration Automatic Message --
This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity.
You can subscribe and participate further through the new bug through this link to our GitLab instance: https://gitlab.freedesktop.org/xorg/driver/xf86-input-libinput/issues/14.