Bug 106785 - Pointer speed is too slow for hidpi displays; xorg is inconsistent with Wayland
Summary: Pointer speed is too slow for hidpi displays; xorg is inconsistent with Wayland
Status: RESOLVED MOVED
Alias: None
Product: xorg
Classification: Unclassified
Component: Input/libinput
Version: unspecified
Hardware: All Linux (All)
Importance: medium enhancement
Assignee: Peter Hutterer
QA Contact: Xorg Project Team
Duplicates: 106737
 
Reported: 2018-06-02 13:25 UTC by Tony Houghton
Modified: 2018-08-10 20:57 UTC
CC: 1 user

Description Tony Houghton 2018-06-02 13:25:06 UTC
This bug used to affect both xorg and Wayland; it's been fixed in Wayland <https://bugs.freedesktop.org/show_bug.cgi?id=99383> but not xorg. Well, not officially, but I found your patch for Fedora <https://bugzilla.redhat.com/show_bug.cgi?id=1413306#c54> and applied it to Arch, and it works well.

I added a second stanza to the example xorg config, though, using MatchIsPointer instead of MatchIsTouchpad (MatchIsPointer doesn't seem to match my touchpad, for some reason), because I also wanted my mouse speed to be consistent between X and Wayland.
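
For reference, here's roughly what the two stanzas look like. This is a sketch, assuming the patch's option is named DPIScaleFactor; the identifiers and the factor of 2 are just examples for my setup:

  Section "InputClass"
      Identifier "touchpad hidpi"          # example identifier
      MatchIsTouchpad "on"
      MatchDriver "libinput"
      Option "DPIScaleFactor" "2"          # example value
  EndSection

  Section "InputClass"
      Identifier "pointer hidpi"           # example identifier
      MatchIsPointer "on"
      MatchDriver "libinput"
      Option "DPIScaleFactor" "2"
  EndSection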

I see you're concerned that it might make some pixels unaddressable. Did you find out how the Wayland developers addressed that concern? My opinion is that it doesn't really matter, because the point of hidpi is to give the user more pixels than they can distinguish individually, for smoother text etc., rather than more desktop space.
Comment 1 Peter Hutterer 2018-06-04 00:03:46 UTC
*** Bug 106737 has been marked as a duplicate of this bug. ***
Comment 2 Peter Hutterer 2018-06-04 00:13:01 UTC
The basic problem is: there is no strict definition of hidpi; the closest thing you get is the scale factor in the desktop environment. The libinput driver/libinput don't know anything about pixels. The X server does, but it doesn't know whether 4000 pixels means "twice the density" or just "twice the pixels".

Example: if you have two monitors at 1920, that doesn't mean you want to scale up by a factor of 2 just because you have a 4k-wide visible area. (Let's ignore, for this argument, that X would know it's two screens ;)

Our problem is "too many parts":
* libinput is in charge of pointer acceleration, but assumes 'normal' pixel density and leaves the rest up to the caller
* xf86-input-libinput just passes bits on
* the X server only sees pixels but doesn't know whether the desktop environment treats it as scaled hi-dpi or just really small pixels
* the desktop environment (usually) knows that but cannot tell anyone that this is the case.

On Wayland, the compositor knows about all this and gets the data from libinput. It can then multiply with the right factor and things work out magically.
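
As a minimal sketch of that flow, assuming a hypothetical cursor_move() on the compositor side (this is not any real compositor's code):

  /* Sketch: scale libinput's accelerated pointer deltas by the
   * output's scale factor before moving the cursor. */
  #include <libinput.h>

  extern void cursor_move(double dx, double dy); /* hypothetical */

  static void handle_motion(struct libinput_event *ev, double output_scale)
  {
      struct libinput_event_pointer *p =
          libinput_event_get_pointer_event(ev);

      /* dx/dy are the accelerated deltas libinput provides,
       * assuming 'normal' pixel density */
      double dx = libinput_event_pointer_get_dx(p) * output_scale;
      double dy = libinput_event_pointer_get_dy(p) * output_scale;

      cursor_move(dx, dy);
  }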

Now, you say: why not just set DPIScaleFactor from the DE? Then I say, good point, but what are you going to do for the mixed-dpi case, given that xf86-input-libinput doesn't know which screen it's on? And that's just the first difficult corner-case :)
Comment 3 Tony Houghton 2018-06-04 11:40:34 UTC
(In reply to Peter Hutterer from comment #2)

[per-monitor dpi scaling]
 
> On Wayland, the compositor knows about all this and gets the data from
> libinput. It can then multiply with the right factor and things work out
> magically.

OK, that explains why it was easier to fix this in Wayland than X.

> Now, you say: why not just set DPIScaleFactor from the DE? Then I say, good
> point, but what are you going to do for the mixed-dpi case, given that
> xf86-input-libinput doesn't know which screen it's on? And that's just the
> first difficult corner-case :)

My understanding was that X doesn't really support different scale factors on different monitors anyway, so I thought in this case GNOME just applies the scale globally. Or does it have a workaround which works for rendering, but not for the pointer?

Hopefully I won't need X for much longer, but for now patched versions of Chromium which support VAAPI are available, and I think the feature is going to land in releases soon. It's really useful, but it only works in X, not with XWayland, because the latter only supports DRI3 while VAAPI still only supports DRI2. That is being worked on too, though.
Comment 4 Peter Hutterer 2018-06-05 00:42:31 UTC
> My understanding was that X doesn't really support different scale factors on
> different monitors anyway, so I thought in this case GNOME just applies the
> scale globally. Or does it have a workaround which works for rendering, but not
> for the pointer?

GNOME cannot apply any pointer motion scale in the X architecture; it can
only render its windows differently. And yes, the mixed-dpi case in X is
still an unsolved problem. AFAIK the server doesn't change the
pointer speed right now either, so you need to find the middle ground
between "too fast on low-dpi" and "too slow on high-dpi" through normal
pointer speed configuration. The problem is simply that the server doesn't
know whether GNOME renders everything twice as big, so it cannot know to
scale the pointer up by a factor of 2.
Comment 5 Tony Houghton 2018-06-05 12:12:19 UTC
I think your Fedora patch is a good solution, and it shouldn't force unwanted behaviour or appearance on users without hidpi monitors. So couldn't it be adopted upstream? And could GNOME/KDE then apply the setting on the fly, instead of users having to edit an xorg.conf file? I don't know if that's possible with X's config architecture/API, though.
Comment 6 Peter Hutterer 2018-06-11 03:30:08 UTC
The setting would have to be exposed as a device property for anything on-the-fly to work, but otherwise it'd be technically possible. But in that case it should be moved to the server itself, because it shouldn't be driver-dependent.
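
Purely to illustrate (hypothetical: no such property exists today, and the property name is invented), on-the-fly configuration would then look the way it does for the existing libinput driver properties:

  # hypothetical property; nothing like it exists today
  xinput set-prop "<device name>" "libinput DPI Scale Factor" 2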

But it's still a hack, I'd need someone to really spend time on this and investigate whether it's possible at all to fix this issue in X and, if so, how.
Blu-Tack development is fun but causes nightmares for maintainers down the road.
Your suggestion would integrate the hack more tightly: once GNOME/KDE start using it, we cannot realistically get rid of the option anymore. So if anyone figures out the correct solution, we'd then also have to figure out a way to undo this hack where it's configured.
Comment 7 GitLab Migration User 2018-08-10 20:57:16 UTC
-- GitLab Migration Automatic Message --

This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity.

You can subscribe and participate further in the new bug through this link to our GitLab instance: https://gitlab.freedesktop.org/xorg/driver/xf86-input-libinput/issues/14.

