The UX (User eXperience, look & feel) of an application is very important on mobile platforms. It can be improved through better design, better screen navigation, and better organization of the information throughout the whole app. But there is also a key behavior that Smart Device users expect: that applications react to what they do on their devices.
On touchscreen devices, information and commands are entered by interacting with the screen.
These inputs are called gestures, and applications have to take them into consideration in order to provide a good UX.
- Tap: Touch the surface briefly with the fingertip.
- Double Tap: Touch the surface rapidly twice with the fingertip.
- Long Tap: Touch the surface for an extended period with the fingertip.
- Drag and Drop: Touch the surface with the fingertip and, without losing contact, drag to the target.
- Swipe Right: Brush the surface quickly to the right with the fingertip.
- Swipe Left: Brush the surface quickly to the left with the fingertip.
- Swipe Up: Brush the surface quickly upward with the fingertip.
- Swipe Down: Brush the surface quickly downward with the fingertip.
Among the controls that support touch events are:

- Attribute (read-only)
- Variable (read-only)
All these gestures are encoded in GeneXus as events related to screen controls.
The first mechanism is to right-click on the user control (previously dragged from the Toolbox to the Layout).
This action displays a contextual menu (as shown below) with every possible event associated with that control.
Once the developer selects one of these options, the IDE automatically switches from the Layout tab to the Events tab, and the selected event associated with the control (in this case, a TextBlock named "Textblock1") will be ready to be coded.
Event Textblock1.Tap
	Msg("Textblock Tap Event Executed")
EndEvent
As an alternative mechanism, the developer can position the cursor in the Events tab and start writing the event by hand, assisted by a suggestions menu.
This menu is displayed when the developer types the control name followed by a dot (e.g. 'Textblock1.').
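For instance, after typing 'Textblock1.' and choosing DoubleTap from the suggestions, the developer could complete the event as follows (the message text is illustrative):

```
Event Textblock1.DoubleTap
	Msg("Textblock DoubleTap Event Executed")
EndEvent
```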
Some of the objects that support touch events are containers; others are non-editable controls. A special consideration applies when both a control and its container define touch events.
Suppose that in a layout we have a Table control named MainTable and, inside it, an Image control named MyImg (i.e. MainTable is a container).
Furthermore, suppose that for MainTable we have coded the Tap and Swipe events, and for MyImg the LongTap and Swipe events.
If a Tap gesture is performed on MyImg, since MyImg does not handle it, the touch event is propagated up through its containers in hierarchical order until a container that handles the Tap event is found. In our case, the Tap event of MainTable will be executed.
If a Swipe gesture is performed on MyImg, the only code executed will be the one defined for MyImg.Swipe; since the event was handled, it is not propagated to its container.
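The scenario above can be sketched in the Events tab as follows (the message texts are illustrative):

```
Event MainTable.Tap
	// Also runs when MyImg is tapped, because MyImg does not handle Tap
	Msg("MainTable Tap executed")
EndEvent

Event MainTable.Swipe
	Msg("MainTable Swipe executed")
EndEvent

Event MyImg.LongTap
	Msg("MyImg LongTap executed")
EndEvent

Event MyImg.Swipe
	// Handled here, so it is not propagated to MainTable
	Msg("MyImg Swipe executed")
EndEvent
```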
- For Android, the Swipe Up and Swipe Down events are not supported.
- Android touch event propagation: if a control (C) has no touch events defined, it propagates the events to its container. If C has one or more touch events defined, it does not propagate any touch event, even though its container may handle those events.
In order for the user to receive feedback when executing a gesture, as of GeneXus Evolution 2 Upgrade 3 the Tap, DoubleTap, and LongTap gestures trigger the highlight properties of the control. That is, if you tap on a control, its "Highlighted Background" class property will be applied to give feedback to the user.