I have a touch screen which generates a plethora of event types according to libinput-debug-events (including what seems to be multi-touch events). I can even see distinct events for the digitizer pen.
However, further up the stack, all this is gone. I don't get any two-finger scrolling, for instance. There is no right-button emulation in Firefox, either. I can still generate primary button click events and (single-finger) drag events, but that's about it.
With Wayland, the pen appears to be completely dead. With X, I get some reaction to the pen in Xournal (some pointer indication moves), but Xournal does not put any ink on the screen.
How can I help to improve matters in this area?
This is with current Fedora rawhide and the 4.9.9 kernel (4.11 seems to have Wifi issues).
Hi Florian,
> I have a touch screen which generates a plethora of event types according to libinput-debug-events (including what seems to be multi-touch events). I can even see distinct events for the digitizer pen.
> However, further up the stack, all this is gone. I don't get any two-finger scrolling, for instance. There is no right-button emulation in Firefox, either. I can still generate primary button click events and (single-finger) drag events, but that's about it.
Handling of touch events is up to the application. Besides the minimal single-touch-to-pointer-event emulation (which only applies to X11/Xwayland), there are no emulation layers to make up for applications that don't handle touch events themselves.
GTK+3 provides built-in support for one-finger touch scrolling, text editing popovers, text selection handles, and other touch features. Apps can nonetheless be developed in touch-unfriendly ways, so some attention is needed throughout the stack. For example, the GNOME HIG largely avoids popup menus, even more so for functionality that would otherwise be inaccessible.
Other apps/toolkits/etc. have to actively take care of touch support themselves. Firefox, despite "using" GTK+, implements large parts of its UI itself; if it doesn't handle touch input, it's pretty much up to the Firefox developers to add support for it.
It's worth pointing out that, at least as far as GTK+3 is concerned, you may be expecting the wrong gestures: scrolling happens with one-finger input, the "long press" gesture doesn't necessarily map to "right click", etc.
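To make the gesture mapping above concrete, here is a toy sketch (plain Python, not GTK's actual implementation; the thresholds and names are invented for illustration) of how a toolkit might classify a single-touch sequence into the gestures described:

```python
# Toy gesture recognizer, for illustration only.
# Classifies one completed touch contact as a one-finger "scroll" drag,
# a "long-press" (e.g. to open a context menu), or a plain "tap".

LONG_PRESS_SECS = 0.5   # assumed hold threshold
MOVE_THRESHOLD = 10.0   # pixels of travel before a contact counts as movement

def classify_touch(events):
    """events: list of (timestamp, x, y) samples for one touch contact,
    ordered from touch-down to touch-up."""
    t0, x0, y0 = events[0]
    t1, x1, y1 = events[-1]
    moved = max(abs(x1 - x0), abs(y1 - y0)) > MOVE_THRESHOLD
    if moved:
        return "scroll"        # one-finger drag scrolls, as in GTK+3
    if t1 - t0 >= LONG_PRESS_SECS:
        return "long-press"    # stationary hold, e.g. context menu
    return "tap"               # short stationary touch -> primary click
```

A held-down stationary contact classifies as "long-press", while any contact that travels more than the threshold becomes a "scroll" drag regardless of duration.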
> With Wayland, the pen appears to be completely dead. With X, I get
Can you interact with gnome-shell with it? Or is it entirely dead?
> some reaction to the pen in Xournal (some pointer indication moves), but Xournal does not put any ink on the screen.
Not entirely sure what's up with X. As Xournal is a GTK+2 application, it requires Xwayland tablet support when running on Wayland. There are already patches in the Xorg patchwork queue adding this missing bridging for tablet events.
> How can I help to improve matters in this area?
You should perhaps file/ping bugs regarding touch support in your favorite apps, and/or wait for ports to GTK+3.
Cheers, Carlos
* Carlos Garnacho:
> Hi Florian,
>> I have a touch screen which generates a plethora of event types according to libinput-debug-events (including what seems to be multi-touch events). I can even see distinct events for the digitizer pen.
>> However, further up the stack, all this is gone. I don't get any two-finger scrolling, for instance. There is no right-button emulation in Firefox, either. I can still generate primary button click events and (single-finger) drag events, but that's about it.
> Handling of touch events is up to the application. Besides the minimal single-touch-to-pointer-event emulation (which only applies to X11/Xwayland), there are no emulation layers to make up for applications that don't handle touch events themselves.
Wow. Doesn't that make it rather unlikely that touch behavior across applications remains consistent?
> Other apps/toolkits/etc. have to actively take care of touch support themselves. Firefox, despite "using" GTK+, implements large parts of its UI itself; if it doesn't handle touch input, it's pretty much up to the Firefox developers to add support for it.
Firefox turns on touch support only when the per-tab subprocess mode is active (i.e., browser.tabs.remote.force-enable is set to true). This is not the default in Fedora.
This Firefox mode provides drag-scrolling (before, dragging only selects text), and it is also possible to open links in a new tab because a long touch on a link triggers the context menu.
However, this works only once. After creating a new tab in this manner, touch events only move the pointer as far as Firefox is concerned, but do not result in click events, so it is no longer possible to interact with the Firefox application. I'll file this as a bug (unless this is a known issue).
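For anyone who wants to reproduce this, the pref named above can be flipped in about:config, or set persistently from the profile's user.js file (the standard Firefox mechanism; the pref name is taken from the paragraph above):

```javascript
// In the Firefox profile's user.js — forces the per-tab subprocess
// (e10s) mode on next startup, which enables Firefox's touch support.
user_pref("browser.tabs.remote.force-enable", true);
```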
>> With Wayland, the pen appears to be completely dead. With X, I get
> Can you interact with gnome-shell with it? Or is it entirely dead?
Not sure. There is some pointer activity in response to the pen: a second pointer appears, but not necessarily at the tip of the pen. So it's not completely dead.
On Thu, Mar 02, 2017 at 08:46:11PM +0100, Florian Weimer wrote:
> * Carlos Garnacho:
>> Hi Florian,
>>> I have a touch screen which generates a plethora of event types according to libinput-debug-events (including what seems to be multi-touch events). I can even see distinct events for the digitizer pen.
>>> However, further up the stack, all this is gone. I don't get any two-finger scrolling, for instance. There is no right-button emulation in Firefox, either. I can still generate primary button click events and (single-finger) drag events, but that's about it.
>> Handling of touch events is up to the application. Besides the minimal single-touch-to-pointer-event emulation (which only applies to X11/Xwayland), there are no emulation layers to make up for applications that don't handle touch events themselves.
> Wow. Doesn't that make it rather unlikely that touch behavior across applications remains consistent?
In the short term, yes. In the long term we hope to be fine, mostly because there are only a few gestures that are easily discoverable and commonly used.
But in any case, our hands are tied: only the application has sufficient context to handle direct touch correctly, e.g. to distinguish between a two-finger scroll gesture and a "move two icons at once" gesture. This is different from a touchpad, where input is less ambiguous.
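The ambiguity above can be sketched in a few lines of Python (a toy heuristic invented for illustration; a real application would also use widget context, which is exactly the point being made):

```python
# Toy classifier for two simultaneous touch contacts, illustrating why
# only the application can resolve the ambiguity: the same raw input can
# mean "two-finger scroll" or "drag two icons independently".

def classify_two_touches(track_a, track_b):
    """track_a, track_b: lists of (x, y) samples for two contacts that
    were down at the same time. Returns "scroll" when both fingers move
    roughly in the same direction, else "independent"."""
    def delta(track):
        (x0, y0), (x1, y1) = track[0], track[-1]
        return (x1 - x0, y1 - y0)

    (dxa, dya), (dxb, dyb) = delta(track_a), delta(track_b)
    dot = dxa * dxb + dya * dyb   # positive when the directions agree
    return "scroll" if dot > 0 else "independent"
```

Two fingers moving in parallel classify as a scroll; fingers moving apart or in opposite directions classify as independent drags. A compositor seeing only coordinates cannot know which interpretation the user intended over, say, an icon grid.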
>> Other apps/toolkits/etc. have to actively take care of touch support themselves. Firefox, despite "using" GTK+, implements large parts of its UI itself; if it doesn't handle touch input, it's pretty much up to the Firefox developers to add support for it.
> Firefox turns on touch support only when the per-tab subprocess mode is active (i.e., browser.tabs.remote.force-enable is set to true). This is not the default in Fedora.
> This Firefox mode provides drag-scrolling (before, dragging only selects text), and it is also possible to open links in a new tab because a long touch on a link triggers the context menu.
> However, this works only once. After creating a new tab in this manner, touch events only move the pointer as far as Firefox is concerned, but do not result in click events, so it is no longer possible to interact with the Firefox application. I'll file this as a bug (unless this is a known issue).
>>> With Wayland, the pen appears to be completely dead. With X, I get
>> Can you interact with gnome-shell with it? Or is it entirely dead?
> Not sure. There is some pointer activity in response to the pen: a second pointer appears, but not necessarily at the tip of the pen. So it's not completely dead.
Wayland handles graphics tablets differently from X; specifically, the tablet has a separate focus and can thus be given a separate cursor. What you're seeing is default behaviour. The big problem right now is that Xwayland doesn't support tablets yet (patches floating around on the list), so the events just disappear into nirvana and/or valhalla, whichever comes first.
However, the second pointer should be at the tip of the pen, if not we have a calibration issue - please file a bug for that.
Cheers, Peter
desktop@lists.fedoraproject.org