Bug 85879 - RFE: absolute mode for logogram input on a touchpad
Summary: RFE: absolute mode for logogram input on a touchpad
Status: RESOLVED MOVED
Alias: None
Product: Wayland
Classification: Unclassified
Component: libinput
Version: unspecified
Hardware: Other All
Importance: low enhancement
Assignee: Wayland bug list
QA Contact:
URL:
Whiteboard:
Keywords:
Duplicates: 86376
Depends on:
Blocks:
 
Reported: 2014-11-05 00:15 UTC by Peter Hutterer
Modified: 2017-06-15 03:12 UTC
CC List: 6 users

See Also:
i915 platform:
i915 features:


Attachments
A draft of the interface (12.73 KB, patch)
2016-06-17 05:57 UTC, Peter Hutterer

Description Peter Hutterer 2014-11-05 00:15:41 UTC
By default touchpads are in relative mode, translating finger input to pointer coordinates or other events (scrolling, tapping, ...).

For the input of non-latin characters, the touchpad should be able to forward touchpoints directly to a client for interpretation.

http://support.apple.com/en-us/ht4288 has the details on how it works on a Mac.
Comment 1 Jonas Ådahl 2014-12-10 09:43:54 UTC
Sounds like this needs a Wayland protocol as well: wl_touch uses surface coordinates, while this feature would need something more output/surface independent. Pretty much somewhere between wl_touch and wl_tablet?
Comment 2 Peter Hutterer 2014-12-10 22:13:10 UTC
There is an experimental handwriting plugin for ibus here:
https://github.com/epico/ibus-handwrite
and a Fedora COPR
http://copr.fedoraproject.org/coprs/pwu/handwrite/

fwiw, I had to run yum reinstall "ibus-*" after installing from source; that triggers something that gets the engine listed. Then you can add it as an IME in GNOME and run the binary in src.

Long story short: this displays a touchpad-shaped surface that reflects the touches and translates them into characters. So once the touchpad is in that mode, the input would be fixed-focus but could be surface-relative. Of course, ideally the surface displayed should mirror the actual shape of the touchpad, which isn't possible through the Wayland protocol yet.

So I think wl_touch as a protocol may just be fine, but whether we make it a side-channel or how to transmit the size of the touchpad is an open question. Plus the bit to ask the compositor to switch the touchpad into that mode, and which device to switch to that mode.
Comment 3 Jonas Ådahl 2014-12-11 01:33:18 UTC
Moving both conversations (this and Bug 87134) here: To use wl_touch we'd have to be able to wl_seat.get_touch in order to get such a device, so you mean that the handwriting client would ask for this feature somehow, and at that time the touchpad would stop being a touchpad and instead advertise itself as a wl_touch device? Should it only do that for that given client then, and what happens when the touch mode is disabled?

It sounds a bit complex to do it that way. Can't we do it similarly to wl_relative_pointer and wl_locked_pointer, and always advertise a wl_touchpad when there is a touchpad connected? Meaning, when the handwriting client wants to do handwriting, it locks the pointer and gets the wl_touchpad (in a similar way to how it'd get a wl_relative_pointer).

With a wl_touchpad we avoid regular touch-enabled clients getting confused by an extra wl_touch device, and we don't need to ignore the interface specification of the wl_touch device when making the touchpad communicate via it. We'd also avoid any touchpad mode changing and simply rely on the pointer lock to keep the client from losing pointer focus. And we'd make it possible for other clients to potentially make use of the touchpad touch points for whatever they may want to use them for, without confusing them with a regular touch device.
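A rough client-side sketch of the wl_touchpad idea above. The pointer lock is written against the real zwp_pointer_constraints_v1 protocol (the later form of the wl_locked_pointer name used in this comment); wl_touchpad_manager and wl_touchpad are invented names, purely for illustration.

/* Hypothetical handwriting client: lock the pointer, then ask for the raw
 * touchpad touch points via an invented wl_touchpad_manager global. */
#include <wayland-client.h>
#include "pointer-constraints-unstable-v1-client-protocol.h"

static void
start_handwriting(struct wl_surface *surface,
                  struct wl_pointer *pointer,
                  struct zwp_pointer_constraints_v1 *constraints,
                  struct wl_touchpad_manager *touchpad_manager)
{
        /* Lock the pointer so the handwriting surface keeps focus while the
         * user draws on the touchpad. */
        struct zwp_locked_pointer_v1 *lock =
                zwp_pointer_constraints_v1_lock_pointer(
                        constraints, surface, pointer, NULL,
                        ZWP_POINTER_CONSTRAINTS_V1_LIFETIME_PERSISTENT);

        /* Hypothetical request: deliver the touchpad's raw touch points to
         * this surface; regular wl_touch-capable clients never see it. */
        struct wl_touchpad *touchpad =
                wl_touchpad_manager_get_touchpad(touchpad_manager, surface);

        /* ... add a listener for down/motion/up events on 'touchpad' ... */
        (void)lock;
        (void)touchpad;
}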
Comment 4 Peter Hutterer 2014-12-11 02:55:55 UTC
wl_touchpad sounds like the best solution. I think within libinput we may still need to mode switch, but that can be done behind the scenes; we don't need to expose the exact behaviour in the protocol.

I don't have any other comments atm, might have to see how this looks implemented.
Comment 5 Peter Hutterer 2015-04-14 23:49:48 UTC
*** Bug 86376 has been marked as a duplicate of this bug. ***
Comment 6 Peter Hutterer 2016-06-17 05:51:20 UTC
Renaming, finally found the right word for it :)
Comment 7 Peter Hutterer 2016-06-17 05:57:22 UTC
Created attachment 124567 [details] [review]
A draft of the interface

First quick draft of the interface, without any actual implementation. The basic approach is: a config option that tells us if the direct mode is available. When enabled, the events are effectively touch events (but in a different interface) with a few tiny differences in the semantics, e.g. libinput will always provide at least one slot.
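For illustration, a sketch of how a consumer loop might look against this draft. The LIBINPUT_EVENT_TOUCHPAD_* event types are taken from the attachment as quoted in the review below; the config call, the accessor names, and the stroke helpers are guesses made up here and are not part of any released libinput API.

#include <libinput.h>

/* Hypothetical handwriting-engine hooks, only here so the loop has something
 * to call into. */
static void feed_stroke_point(double x_mm, double y_mm) { (void)x_mm; (void)y_mm; }
static void finish_stroke(void) { }

static void
handle_events(struct libinput *li, struct libinput_device *device)
{
        /* Hypothetical config call: switch the touchpad into direct mode. */
        libinput_device_config_direct_touch_set_enabled(device, 1);

        struct libinput_event *ev;

        libinput_dispatch(li);
        while ((ev = libinput_get_event(li)) != NULL) {
                switch (libinput_event_get_type(ev)) {
                case LIBINPUT_EVENT_TOUCHPAD_TOUCH_DOWN:
                case LIBINPUT_EVENT_TOUCHPAD_TOUCH_MOTION: {
                        struct libinput_event_touchpad_touch *t =
                                libinput_event_get_touchpad_touch_event(ev);
                        /* Coordinates are physical (mm) per the draft. */
                        feed_stroke_point(libinput_event_touchpad_touch_get_x(t),
                                          libinput_event_touchpad_touch_get_y(t));
                        break;
                }
                case LIBINPUT_EVENT_TOUCHPAD_TOUCH_UP:
                        finish_stroke();
                        break;
                default:
                        break;
                }
                libinput_event_destroy(ev);
        }
}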
Comment 8 Jonas Ådahl 2016-06-17 18:29:56 UTC
Comment on attachment 124567 [details] [review]
A draft of the interface

Review of attachment 124567 [details] [review]:
-----------------------------------------------------------------

::: src/libinput.h
@@ +541,5 @@
> + *
> + * Valid event types for this event are
> + * @ref LIBINPUT_EVENT_TOUCHPAD_TOUCH_DOWN,
> + * @ref LIBINPUT_EVENT_TOUCHPAD_TOUCH_MOTION,
> + * @ref LIBINPUT_EVENT_TOUCHPAD_TOUCH_UP.

Any use for a LIBINPUT_EVENT_TOUCHPAD_FRAME event?

@@ +543,5 @@
> + * @ref LIBINPUT_EVENT_TOUCHPAD_TOUCH_DOWN,
> + * @ref LIBINPUT_EVENT_TOUCHPAD_TOUCH_MOTION,
> + * @ref LIBINPUT_EVENT_TOUCHPAD_TOUCH_UP.
> + */
> +struct libinput_event_touchpad_touch;

Would it maybe be better to just group this as a libinput_event_touchpad event? I can't think of any events other than touch down/move/up, but grouping it under "touchpad" would at least not block us or make it awkward to add some other touchpad-specific event later.

@@ +4954,5 @@
> + * require changing to a neutral state first before activating the new
> + * method.
> + *
> + * Disabling the direct touch mode on a device that does not have a direct
> + * touch mode always succeeds.

Should it be specified what happens to active gestures (including two-finger scroll and tap) etc.? Do they continue, or are they cancelled in some way?
Comment 9 Peter Hutterer 2016-06-18 02:55:54 UTC
I don't think a frame event is needed here. The history of that is a bit blurry anyway; AIUI the frame event stems largely from the MT protocol A in Qt, but with protocol B the frame event is to some extent superfluous (still useful, just not as needed). This interface is aimed primarily at single-finger interactions, so I think a frame event is unnecessary here, though it could be added later.

Renaming to libinput_event_touchpad sounds good, I'll do that in the next revision.

(In reply to Jonas Ådahl from comment #8)
> @@ +4954,5 @@
> > + * require changing to a neutral state first before activating the new
> > + * method.
> > + *
> > + * Disabling the direct touch mode on a device that does not have a direct
> > + * touch mode always succeeds.
> 
> Should it be specified what happens to active gestures (including two-finger
> scroll and tap) etc.? Do they continue, or are they cancelled in some way?

That's what the sentence just above should've clarified: unless the device is in a neutral state, the mode won't activate yet. So any ongoing gestures have to end before the mode is activated. I'll try to add/reword this.
Comment 10 Carlos Garnacho Parro 2016-06-18 09:55:49 UTC
Fwiw, the API makes sense to me. I first thought it was unfortunate to have only get_x/y (thus millimeters) if we want to map this somewhere on screen, but we need to deal with millimeters anyway because we don't know the device aspect ratio, so it's probably best to keep it as is.
Comment 11 Peter Hutterer 2016-06-18 13:07:43 UTC
fwiw, the aspect ratio is available through libinput_device_get_size(). I'm intentionally not providing transformed_x/y calls though, because, as you said, this needs to be handled in physical coordinates as-is; otherwise we'll just get squashed input.
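A minimal sketch of the mapping being discussed here: libinput_device_get_size() is real libinput API, while the touch coordinates are assumed to come from the draft get_x/y accessors mentioned above, in millimeters.

#include <libinput.h>

/* Map a physical (mm) touch point onto an on-screen canvas without
 * distorting the touchpad's aspect ratio. */
static void
map_to_canvas(struct libinput_device *device,
              double touch_x_mm, double touch_y_mm,
              int canvas_w_px, int canvas_h_px,
              double *out_x, double *out_y)
{
        double pad_w_mm, pad_h_mm;

        if (libinput_device_get_size(device, &pad_w_mm, &pad_h_mm) != 0)
                return; /* device has no reliable physical size */

        /* Use one scale factor for both axes so the drawn stroke keeps the
         * touchpad's shape instead of being squashed to fit the canvas. */
        double scale_x = canvas_w_px / pad_w_mm;
        double scale_y = canvas_h_px / pad_h_mm;
        double scale = scale_x < scale_y ? scale_x : scale_y;

        *out_x = touch_x_mm * scale;
        *out_y = touch_y_mm * scale;
}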
Comment 12 Carlos Garnacho Parro 2016-06-18 14:25:14 UTC
(In reply to Peter Hutterer from comment #11)
> fwiw, the aspect ratio is available through libinput_device_get_size(). I'm

I know :). My train of thought was that, even if you have get_x/y_transformed calls, you still need this call in order to pass a proper width/height to those, so the whole thing would rely on millimeters after all. It's not like touchscreens where you can more or less assume a correlation between input and output ratios.

Anyway, we're agreeing :P
Comment 13 Peter Hutterer 2016-06-20 05:12:18 UTC
I've pushed a minimal implementation to this tree, which I'll keep updating as we go along:
https://github.com/whot/libinput/tree/wip/touchpad-logograms

There are a few bits lacking (buttons, hysteresis), but it should be enough to get a PoC working on top of it.
Comment 14 Peter Hutterer 2017-06-15 03:12:04 UTC
Closing as moved. We roughly know what we want to do but the patch itself hasn't been touched (or tested?) in a year, so let's re-open this when we have an actual demand for it.

