There is no true multitouch with the synaptics driver on Linux. Two-finger scrolling works, but I am unable to touch one area of the pad and move the cursor with another finger. This makes dragging and dropping extremely difficult. The HP site calls the clickpad/trackpad an "HP Imagepad with on/off button and precision multi-touch gesture support."

I am running Arch Linux x64 with xf86-input-synaptics 1.6.2.

Here is my laptop: http://www.notebookcheck.net/HP-Envy-15-3040nr-Laptop-Review.72051.0.html
Official page: http://h10025.www1.hp.com/ewfrf/wc/document?cc=us&dlc=en&docname=c03130023&lc=en&product=5218394
Here is a video where I attempt to demonstrate the issue: http://youtu.be/-F84TOXtAF8
Please attach your Xorg.log.
Created attachment 67211
Here you go.
ok, so the clickpad is recognised, but re-reading your initial comment, I think you're just trying to do something that's not implemented. Drag&drop works if you press the physical button and use a second finger to move, and tap-and-drag works as well, but tapping with one finger and moving the pointer with the other is simply not implemented (yet).
That's what I thought. Thank you for looking into this. Is there any timeframe for when this will be implemented?
not really, sorry. unlikely to be implemented, unless you do it yourself.
:( How come?
lack of manpower. that's pretty much it. also, I suspect this feature is rather difficult to implement given the hardware data we get and the state of the driver in general. So getting this right on all touchpads is hard, and not breaking any other features will require extensive testing.
I noticed that when I uninstall xf86-input-synaptics, I get my desired behavior. However, this is not because of multitouch; that is still missing when no driver is loaded.

When no driver is loaded, the bottom part of my clickpad is not detected as a touch area. It is still detected as clickable, but not for touch. This way, it acts like a traditional touchpad with physical buttons. This is how my clickpad behaves on Windows 7.

http://i48.tinypic.com/20f3pch.png
http://www.blogcdn.com/www.engadget.com/media/2010/05/hppavilionhands-on05.jpg

The top image is my clickpad; the bottom is a popular clickpad used by HP laptops. For my clickpad specifically, I want the bottom part of the pad to not be touch-detected. Is there a way to detect these areas out of the box using the synaptics driver? Is there some configuration to achieve this?
(In reply to comment #8)
> I noticed that when I uninstall xf86-input-synaptics, I get my desired
> behavior. However, this is not because of multitouch; that is still
> missing when no driver is loaded.

if you uninstall the synaptics driver, the evdev driver will take over your touchpad. evdev has some very basic touchpad features, but that this particular feature then works for you is by accident, not by design.

> When no driver is loaded, the bottom part of my clickpad is not detected
> as a touch area. It is still detected as clickable, but not for touch.

tbh, that is rather odd, given that evdev has no such feature.

> The top image is my clickpad; the bottom is a popular clickpad used by HP
> laptops. For my clickpad specifically, I want the bottom part of the pad
> to not be touch-detected. Is there a way to detect these areas out of the
> box using the synaptics driver? Is there some configuration to achieve this?

not out of the box, but try setting the AreaBottomEdge option (see the synaptics man page). that should get you most of the way there.
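For anyone else hitting this, a minimal sketch of such a configuration (the file name and the coordinate value below are only placeholders, not values for this particular pad; the right bottom-edge coordinate depends on the device's reported axis range, which "synclient -l" or the Xorg.log will show):

    # /etc/X11/xorg.conf.d/50-clickpad-deadzone.conf  (example file name)
    Section "InputClass"
        Identifier "clickpad bottom dead zone"
        MatchIsTouchpad "on"
        Driver "synaptics"
        # Finger motion, scrolling and tapping below this Y coordinate
        # (in device units) are ignored; 4000 is a placeholder value.
        Option "AreaBottomEdge" "4000"
    EndSection

For quick experiments the same option can be changed at runtime with "synclient AreaBottomEdge=4000" and tuned until the dead zone lines up with the physical button area.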
I found a driver called xf86-input-mtrack. It gives me the feature that I need, but it's meant for Apple trackpads. Is there a way to check what it is in the mtrack driver that could be merged into the synaptics one?
have a look at the source and see what you can find. synaptics has the (dis)advantage of working for a whole range of models, so any feature merged across would have to keep the other models working. given that mtrack is GPL, we can't merge code directly though.
What is the current situation regarding this bug? Having a clickpad that behaves as one large button makes certain laptops borderline unusable.
this feature is not being worked on atm.
The issue I am having might be similar. On the Dell XPS 13 there is a clickpad where the touch surface covers the physical buttons. When I set AreaBottomEdge to the correct value, moving one finger into that zone doesn't move the pointer; so far so good. BUT if I rest a finger on the physical left button (left hand), which is inside the dead zone, that finger is still registered by the driver and counted as part of a multitouch gesture: if I move another finger (right hand) up and down in the live zone, it scrolls up/down (because I have two-finger scrolling enabled). Disabling two-finger scrolling doesn't help; the second finger (right hand) still cannot move the pointer.

This is annoying because I was used to having physical buttons on my old laptop and always kept a finger resting on the button. One possible fix would be to make the AreaBottomEdge dead zone really dead, meaning any finger touching or resting inside that zone is completely ignored. What do you think?
I agree with Vincent: making the dead area truly dead would significantly improve usability. Where should we look to try and add this feature?
No-one has taken this up in over 2 years and synaptics is in maintenance mode. Closing as WONTFIX, sorry. libinput does proper multitouch now, though.
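For anyone finding this bug later, a minimal sketch of switching the touchpad to the libinput driver instead (the file name is an example; ClickMethod accepts "buttonareas" or "clickfinger", see the libinput(4) man page). This assumes the xf86-input-libinput package is installed and that xf86-input-synaptics is removed or no longer matching the device:

    # /etc/X11/xorg.conf.d/40-libinput-touchpad.conf  (example file name)
    Section "InputClass"
        Identifier "libinput clickpad"
        MatchIsTouchpad "on"
        Driver "libinput"
        # "clickfinger": 1/2/3 fingers on the pad map to left/right/middle click;
        # "buttonareas": the bottom of the pad is split into software button areas.
        Option "ClickMethod" "clickfinger"
    EndSection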