| Summary: | RFE: absolute mode for logogram input on a touchpad | | |
|---|---|---|---|
| Product: | Wayland | Reporter: | Peter Hutterer <peter.hutterer> |
| Component: | libinput | Assignee: | Wayland bug list <wayland-bugs> |
| Status: | RESOLVED MOVED | QA Contact: | |
| Severity: | enhancement | | |
| Priority: | low | CC: | alexepico, benjamin.tissoires, carlosg, jadahl, peter.hutterer, petersen |
| Version: | unspecified | | |
| Hardware: | Other | | |
| OS: | All | | |
| Whiteboard: | | | |
| i915 platform: | | i915 features: | |
| Attachments: | A draft of the interface | | |
Description
Peter Hutterer
2014-11-05 00:15:41 UTC
Sounds like this needs Wayland protocol as well: wl_touch is in surface coordinates, while this feature would need something surface-independent, somewhere between wl_touch and wl_tablet.

There is an experimental handwriting plugin for ibus here: https://github.com/epico/ibus-handwrite and a Fedora COPR: http://copr.fedoraproject.org/coprs/pwu/handwrite/

fwiw, I had to yum reinstall "ibus-*" after installing from source; that triggers something to get the engine listed. Then you can add it to the IME in GNOME and trigger the binary in src.

Long story short: this displays a touchpad-shaped surface that reflects the touches and translates them to characters. So once the touchpad is in that mode, the input would be fixed-focus but can be surface-relative. Ideally, the surface displayed should mirror the actual shape of the touchpad, which isn't possible through the Wayland protocol yet. So wl_touch as a protocol may just be fine, but whether we make it a side channel, and how to transmit the size of the touchpad, is an open question. Plus the bit to ask the compositor to switch the touchpad into that mode, and which device to switch to that mode.

Moving both conversations (this and Bug 87134) here:

To use wl_touch we'd have to be able to wl_seat.get_touch in order to get such a device. So you mean that the handwriting client would somehow ask for this feature, and at that point the touchpad would stop being a touchpad and instead advertise itself as a wl_touch device? Should it only do that for that given client, and what happens when the touch mode is disabled? It sounds a bit complex to do it that way.

Can't we do it similarly to wl_relative_pointer and wl_locked_pointer, and always advertise a wl_touchpad when there is a touchpad connected? Meaning: when the handwriting client wants to do handwriting, it locks the pointer and gets the wl_touchpad (in the same way as it'd get a wl_relative_pointer).
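As an editorial illustration of the "always advertise a wl_touchpad" idea discussed above, a protocol extension might be sketched like this. None of these interface, request, or event names exist in any shipped Wayland protocol; they are assumptions made up for this sketch:

```xml
<!-- Hypothetical sketch only, not a real protocol. It models the idea
     from the discussion: the touchpad is always advertised, the client
     activates it on demand (alongside a pointer lock), and coordinates
     plus device size are reported in physical millimeters. -->
<protocol name="touchpad_direct_unstable_v1">
  <interface name="zwp_touchpad_manager_v1" version="1">
    <request name="get_touchpad">
      <description summary="obtain the touchpad object for a seat"/>
      <arg name="id" type="new_id" interface="zwp_touchpad_v1"/>
      <arg name="seat" type="object" interface="wl_seat"/>
    </request>
  </interface>
  <interface name="zwp_touchpad_v1" version="1">
    <event name="size">
      <description summary="physical size of the touchpad"/>
      <arg name="width" type="fixed" summary="width in mm"/>
      <arg name="height" type="fixed" summary="height in mm"/>
    </event>
    <event name="down">
      <arg name="id" type="int" summary="touch slot"/>
      <arg name="x" type="fixed" summary="mm from the left edge"/>
      <arg name="y" type="fixed" summary="mm from the top edge"/>
    </event>
    <event name="motion">
      <arg name="id" type="int" summary="touch slot"/>
      <arg name="x" type="fixed" summary="mm from the left edge"/>
      <arg name="y" type="fixed" summary="mm from the top edge"/>
    </event>
    <event name="up">
      <arg name="id" type="int" summary="touch slot"/>
    </event>
  </interface>
</protocol>
```

A dedicated interface like this would avoid the problem raised above: regular touch-enabled clients never see an extra wl_touch device, and the wl_touch specification does not need to be bent to carry touchpad data.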
With a wl_touchpad we avoid confusing regular touch-enabled clients with an extra wl_touch device, and we don't need to ignore the interface specification of wl_touch when making the touchpad communicate via it. We'd also avoid any touchpad mode changing and simply rely on the pointer lock to keep the client from losing pointer focus. And we'd make it possible for other clients to use the touchpad touch points for whatever they want, without confusing them with a regular touch device.

wl_touchpad sounds like the best solution. I think within libinput we may still need to mode-switch, but that can be done behind the scenes; we don't need to expose the exact behaviour in the protocol. I don't have any other comments atm, might have to see how this looks implemented.

*** Bug 86376 has been marked as a duplicate of this bug. ***

Renaming, finally found the right word for it :)

Created attachment 124567 [details] [review]
A draft of the interface

First quick draft of the interface, without any actual implementation. The basic approach: a config option that tells us if the direct mode is available. When enabled, the events are effectively touch events (but in a different interface) with a few tiny differences in the semantics, e.g. libinput will always provide at least one slot.

Comment on attachment 124567 [details] [review]
Review of attachment 124567 [details] [review]:
-----------------------------------------------------------------

::: src/libinput.h
@@ +541,5 @@
> + *
> + * Valid event types for this event are
> + * @ref LIBINPUT_EVENT_TOUCHPAD_TOUCH_DOWN,
> + * @ref LIBINPUT_EVENT_TOUCHPAD_TOUCH_MOTION,
> + * @ref LIBINPUT_EVENT_TOUCHPAD_TOUCH_UP.

Any use for a LIBINPUT_EVENT_TOUCHPAD_FRAME event?

@@ +543,5 @@
> + * @ref LIBINPUT_EVENT_TOUCHPAD_TOUCH_DOWN,
> + * @ref LIBINPUT_EVENT_TOUCHPAD_TOUCH_MOTION,
> + * @ref LIBINPUT_EVENT_TOUCHPAD_TOUCH_UP.
> + */
> +struct libinput_event_touchpad_touch;

Would it maybe be better to just group this as a libinput_event_touchpad event? I can't think of other events than touch down/move/up, but grouping it under "touchpad" would at least not block us or make it awkward to add some other touchpad-specific event later.

@@ +4954,5 @@
> + * require changing to a neutral state first before activating the new
> + * method.
> + *
> + * Disabling the direct touch mode on a device that does not have a direct
> + * touch mode always succeeds.

Should it be specified what happens to active gestures (including two-finger scroll and tap)? Do they continue, or are they cancelled in some way?

I don't think a frame event is needed here. The history of that is a bit blurry anyway; AIUI the frame stems largely from the MT protocol A in Qt, but with protocol B the frame event is to some extent superfluous (still useful, just not as needed). This interface is aimed primarily at single-finger interactions, so I think a frame event is unnecessary here, though it could be added later.

Renaming to libinput_event_touchpad sounds good, I'll do that in the next revision.

(In reply to Jonas Ådahl from comment #8)
> @@ +4954,5 @@
> > + * require changing to a neutral state first before activating the new
> > + * method.
> > + *
> > + * Disabling the direct touch mode on a device that does not have a direct
> > + * touch mode always succeeds.
>
> Should it be specified what happens to active gestures (including two finger
> scroll, and tap) etc? Do they continue, or are they cancelled in some way?

That's what the sentence just above should've clarified: unless the device is in a neutral state, the mode won't activate yet, so any ongoing gestures have to end before the mode is activated. I'll try to add/reword this.

Fwiw, the API makes sense to me.
I first thought it was unfortunate to have only get_x/y (thus millimeters) if we want to map this somewhere on screen, but we need to deal with millimeters anyway because we don't otherwise know the device aspect ratio, so it's probably best to keep it as is.

fwiw, the aspect ratio is available through libinput_device_get_size(). I'm intentionally not providing the transformed_x/y calls though, because as you said this needs to be handled in as-is physical coordinates, otherwise we'll just get squashed input.

(In reply to Peter Hutterer from comment #11)
> fwiw, the aspect ratio is available through libinput_device_get_size(). I'm

I know :). My train of thought was that even if you had get_x/y_transformed calls, you would still need this call in order to pass a proper width/height to them, so the whole thing would rely on millimeters after all. It's not like touchscreens, where you can more or less assume a correlation between input and output ratios. Anyway, we're agreeing :P

I've pushed a minimal implementation to this tree, which I'll keep updating as we go along: https://github.com/whot/libinput/tree/wip/touchpad-logograms There are a few bits lacking (buttons, hysteresis) but it should be enough to get some PoC working on top of it.

Closing as moved. We roughly know what we want to do, but the patch itself hasn't been touched (or tested?) in a year, so let's re-open this when we have actual demand for it.