This page describes the state of touch input and events on all tier-1 platforms. On this page "input" will refer to the input events that Gecko receives from the OS, and "events" will refer to DOM events dispatched into web content.
In general, touch input received from the user can be dispatched to web content in the form of (a) touch events, (b) pointer events, and (c) mouse events.
Touch events are covered by the W3C Touch Events specification. These are widely used on the web, but they have main-thread dependencies which make them unsuitable for async touch-based pan/zoom. For backwards compatibility we need to support them, but going forward we would like to use pointer events instead and deprecate touch events.
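The main-thread dependency can be sketched as follows. This is an illustrative simulation, not Gecko code: before async scrolling can begin, the compositor must wait for content's touch handlers to run on the main thread and report whether any of them called preventDefault().

```javascript
// Hypothetical sketch (not Gecko code) of the main-thread dependency that
// makes touch events unsuitable for async pan/zoom: the pan cannot start
// until content's handlers have run and reported a preventDefault() verdict.
function dispatchTouchToContent(handlers, touchEvent) {
  let defaultPrevented = false;
  const event = Object.assign({}, touchEvent, {
    preventDefault() { defaultPrevented = true; },
  });
  // Content handlers run on the main thread; the compositor is blocked on
  // this result and cannot begin the pan until it arrives.
  for (const handler of handlers) {
    handler(event);
  }
  return { allowAsyncPan: !defaultPrevented };
}
```

If the main thread is busy, this round trip delays the start of scrolling, which is the core problem described above.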
Pointer events were proposed by Microsoft. Google initially supported the proposal but has since changed its position and, as of this writing, does not plan to implement it. Microsoft is trying to push pointer events, and people from MS Open Tech are actively implementing pointer events support in Gecko. We would like Gecko to support pointer events, and would also like Google to support them for better interoperability. There is some code in nsPresShell in Gecko which can generate and dispatch appropriate pointer events based on touch/mouse input.
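The kind of mapping the nsPresShell code performs can be sketched as below. The table and helper are illustrative, not actual Gecko APIs: each touch or mouse input maps to a pointer event of the corresponding type, tagged with a pointerType.

```javascript
// Illustrative sketch (names are hypothetical, not Gecko APIs) of generating
// pointer events from touch/mouse input, as nsPresShell does.
const TOUCH_TO_POINTER = {
  touchstart: 'pointerdown',
  touchmove: 'pointermove',
  touchend: 'pointerup',
  touchcancel: 'pointercancel',
};
const MOUSE_TO_POINTER = {
  mousedown: 'pointerdown',
  mousemove: 'pointermove',
  mouseup: 'pointerup',
};

function pointerEventFor(inputEvent) {
  if (inputEvent.type in TOUCH_TO_POINTER) {
    return { type: TOUCH_TO_POINTER[inputEvent.type], pointerType: 'touch' };
  }
  if (inputEvent.type in MOUSE_TO_POINTER) {
    return { type: MOUSE_TO_POINTER[inputEvent.type], pointerType: 'mouse' };
  }
  return null; // not an input we generate pointer events for
}
```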
Mouse events are also dispatched for backwards compatibility with non-touch-aware web content. The only case where mouse events need to be dispatched to content is when the user does a "tap" and the resulting touch events are not prevent-defaulted by web content. In this case three mouse events (mousemove, mousedown, mouseup) need to be dispatched at the point of the tap. A fourth event (click) is created by the EventStateManager from the mousedown/mouseup pair. Note that some sort of gesture detection must happen somewhere in order to determine that a sequence of touch input corresponds to a "tap".
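The compatibility sequence above can be sketched as follows. The helper is hypothetical, but the event ordering matches the description: mousemove, mousedown, mouseup at the tap point, with the click derived from the down/up pair.

```javascript
// Sketch (hypothetical helper, not Gecko code) of the compatibility mouse
// events synthesized for a tap whose touch events were not prevent-defaulted.
function syntheticMouseSequenceForTap(x, y, touchWasPreventDefaulted) {
  if (touchWasPreventDefaulted) {
    return []; // content consumed the touch; no compatibility mouse events
  }
  const events = ['mousemove', 'mousedown', 'mouseup'].map(
    type => ({ type, x, y })
  );
  // In Gecko the click is not dispatched directly by the widget code; the
  // EventStateManager creates it from the mousedown/mouseup pair. We append
  // it here only to show the full sequence content observes.
  events.push({ type: 'click', x, y });
  return events;
}
```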
On B2G we receive only touch input from the OS. The B2G widget code converts the touch input into touch events and dispatches them to web content. After that, different code paths take effect for the parent process and the child process.
- In the parent process, if the touch events are not prevent-defaulted, the B2G widget code (GeckoTouchDispatcher::DispatchMouseEvent) will dispatch mouse events corresponding to the touch input. Note that in this case there is no tap gesture detection - all of the touch input triggers mouse events in the widget. Bug 1005815 covers this problem.
- In the child process, if the touch events are not prevent-defaulted, the APZ code (or code in TabChild if APZ is disabled) will run gesture detection code. If the touch input corresponds to a tap, it will trigger dispatch of mouse events as required.
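The tap gesture detection the child-process path relies on can be sketched as below. This is a minimal illustration: a touch sequence counts as a tap when the finger neither moves beyond a small slop radius nor stays down too long. The thresholds here are made up for illustration, not Gecko's actual values.

```javascript
// Minimal sketch of tap gesture detection of the kind APZ (or TabChild)
// performs. Thresholds are illustrative, not Gecko's actual values.
const TAP_SLOP_PX = 10;     // max movement allowed for a tap (assumed)
const TAP_TIMEOUT_MS = 300; // max duration allowed for a tap (assumed)

function isTap(touchStart, touchEnd) {
  const dx = touchEnd.x - touchStart.x;
  const dy = touchEnd.y - touchStart.y;
  const movedTooFar = Math.hypot(dx, dy) > TAP_SLOP_PX;
  const heldTooLong = touchEnd.time - touchStart.time > TAP_TIMEOUT_MS;
  return !movedTooFar && !heldTooLong;
}
```

A sequence that moves too far is a pan, and one held too long is a long-press; only a genuine tap triggers the compatibility mouse events.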
Pointer events on B2G are not enabled, but can be created and dispatched from nsPresShell based on the incoming touch input. The B2G widget code should not need to deal with pointer events at all.
On Android we receive only touch input from the OS. The Android widget code converts the touch input into touch events and dispatches them to web content. The Java pan/zoom controller does gesture detection on incoming touch input and, assuming the touch events are not prevent-defaulted, notifies the Fennec browser.js code of a "tap" event. The browser.js code then dispatches the mouse events as required.
Pointer events on Android are not enabled, but can be created and dispatched from nsPresShell based on the incoming touch input. The Android widget code should not need to deal with pointer events at all.
On Windows Desktop we receive touch input as well as fallback mouse input from the OS. Touch events are currently disabled on Windows Desktop, so the touch input is discarded in the Windows widget code. Gesture detection for taps is done in the OS, and the widget code is notified in the form of fallback mouse input. This is handled like any other mouse input in Gecko and passed through to web content. Since the Windows Desktop code doesn't send touch events, we don't have to worry about checking for prevent-defaulted touch events.
The code to generate pointer events from mouse/touch input in nsPresShell can be enabled on Windows Desktop. However, since touch input is not passed from the widget to Gecko, no corresponding pointer events would be generated. The options for having pointer events on Windows Desktop are to either (1) start forwarding the touch input from the widget to Gecko, or (2) add extra code in the widget to generate pointer events directly.
On Windows Metro we receive touch input as well as fallback mouse input from the OS. Touch events are generated in the Windows Metro widget code and dispatched to web content. Gesture detection for taps is done in the OS, and the widget code is notified in the form of tap inputs. The widget code then generates mouse events from these tap inputs, provided the touch events were not prevent-defaulted.
The code to generate pointer events from mouse/touch input in nsPresShell can be enabled on Windows Metro. Since touch input is passed from the widget to Gecko this should Just Work (TM).
There is no touch event support in the OS X widget code; any touch events delivered by the OS are dropped.
GTK+2 does not provide specific support for handling touch input. It may be possible to handle touch input through filters for X11 events, but this may be non-trivial.
GTK+3 provides support for touch input. Returning TRUE (synchronously) from a touch-event GtkWidget signal handler indicates that the touch input is handled, which inhibits the corresponding signal (button-press for BEGIN, motion-notify for UPDATE, or button-release for END) for emulated mouse input. Gecko support is not yet implemented; this is tracked in bug 978679.
mbrubeck has a page at Gecko/Touch which tracks various other aspects of touch events.