Multi-Touch
v.8.20
© 2020 Ing. Punzenberger COPA-DATA GmbH
Distribution and/or reproduction of this document or parts thereof in any form is permitted solely
with the written permission of the company COPA-DATA. Technical data is used only for product
description and is not a guaranteed property in the legal sense. Subject to change, technical or
otherwise.
Contents
2 Multi-Touch
3 Gestures
GENERAL HELP
If you cannot find the information you require in this help chapter, or would like to suggest
additions, please send an email to [email protected].
PROJECT SUPPORT
Our customer service team provides support for any real project you may have; you can contact
them via email at [email protected].
2 Multi-Touch
With zenon, touch screens can also be operated with Multi-Touch gestures.
You can obtain an example project (on page 18) from your zenon consultant.
Attention
Two-hand operation in Multi-Touch can be implemented with two buttons: One
button triggers the block, the other executes the action.
The following must be the case for this:
Both pressure points are each configured in their own screen
Both screens are based on their own frame
REQUIREMENTS
The following is recommended for Multi-Touch:
Activation of DirectX Hardware for the zenon property Graphical design/Graphics quality.
In doing so, note the recommended minimum requirements from the System requirements when using DirectX chapter.
CONFIGURING MULTI-TOUCH
To use Multi-Touch:
1. Deactivate the project property Windows CE project
2. Configure Multi-Touch in:
a) project properties
b) screen properties
c) properties of the dynamic elements
Attention
In the Control Panel of Windows 8, the visualization of the finger and pen input
can be influenced globally:
If the visualization is deactivated, there is also no visualization in zenon,
regardless of what has been configured.
If "Optimize visual feedback for the output to an external monitor" is
used, then the visual feedback is enhanced and always displayed in zenon,
regardless of the settings in the project properties.
Default: Visual feedback is activated but not enhanced. The behavior can also
be set with zenon.
In the project, you can find configuration options for Multi-Touch with the following properties:
For screens, in the groups:
Interaction
Programming interface
VSTA gesture recognition
For dynamic elements in the groups:
VSTA gesture recognition
Runtime/Tap and hold
3 Gestures
With zenon Multi-Touch, all Windows-based gestures are available. You can see the number of
input points that your touch system provides in the System area of the system properties.
WINDOWS 7 GESTURES
Selection of gestures that are often used in zenon:
Gesture | Description | Windows standard
Press and hold | Press the target and tap with a second finger. | Right click.
Tap and hold | Press, wait for the ring animation, release. | Right click.
Tap/Double tap | Tap with one finger or tap twice quickly. | Click/double click.
Zoom | Moving two fingers away from one another. | Zoom (Control key + mouse wheel).
Two-finger tap | Tap at the same time with two fingers. The target is between the fingers. | None.
WINDOWS 8 GESTURES
Selection of gestures that are often used in zenon:
Gesture | Description | Windows standard
Press and hold | Press the target and tap with a second finger. | Right click.
Tap and hold | Press, wait for the ring animation, release. | Right click.
Tap/Double tap | Tap with one finger or tap twice quickly. | Click/double click.
Drag | Place one finger on the object and drag with the finger. | Switch between screens or menus; select and move objects.
Zoom | Moving two fingers away from one another. | Zoom (Control key + mouse wheel).
If the screen is touched with two fingers in Windows 8, the action depends on the elements that are
touched.
Prerequisite: In the project properties, the current operating system [Windows 8] must be selected
in the Interaction node for the Recognition property. zenon must be running on a Windows 8
computer in the Runtime.
Configuring the interaction
CONFIGURATION
Interactions can be configured for:
Screens
Dynamic screen elements
Various elements such as lists
The actions that can be assigned to gestures depend on the screen or the screen element that is to
be configured.
Information
Frames can also be moved with the mouse if the screen is not a worldview. To
do this, the Move Frame via mouse property must be activated. In the
Runtime, clicking in a free area of the screen with the left mouse button and
then moving the mouse with the button held down moves the whole screen.
4.1 Reactions
Reactions to gestures can be individually configured for screens and screen elements, as well as
various elements such as lists:
1. The following are available in the Interaction group for supported screens:
Reactions to Tap and hold
Reactions are used to define what happens when the respective gesture is detected on the screen or
the screen element in the Runtime.
Reaction: Selection of desired reaction from drop-down list. The reactions that are available
depend on the screen type/element.
Function: Selection of a function configured in zenon if Execute own function was selected
for Reaction.
Note: There is a significant difference between screens and screen elements with Execute own
function: Interlockings and user rights can also be configured for screen elements. This is not possible
for screens, because the screen does not have screen functions.
Screens:
can be used for runtime environments that are not touch-compatible
are backwards-compatible: new gestures can be supported and existing gestures can receive new settings
If a screen is copied, the respective properties that have been set are carried over.
4.1.1 Manipulation
Screens and certain screen elements such as lists can be manipulated with touch gestures. You can
define the desired reaction for a gesture in the Editor in Manipulation group for:
Move (only diagram window in Extended Trend)
Move horizontally
Move vertically
Zoom
Whether screens or screen elements are manipulated depends on the configuration of the screen
size:
Use screen size from frame property active: The screen is manipulated (moved, zoomed).
Worldview: The respective active element in the screen is manipulated provided the element
supports this.
Definition of worldview: Use screen size from frame property is inactive and the screen is
larger than the frame.
ZOOM
A screen can only be zoomed within the limits that have been set for the following properties:
Width (maximum) [pixels]
Height (maximum) [pixels]
Width (minimum) [pixels]
Height (minimum) [pixels]
If a limit has been reached when zooming in the Runtime, then an attempt is made to continue
zooming in the free directions. The aspect ratio is taken into account in the process.
In the Extended Trend module, in addition to the window, the curve graphics can also be zoomed
into with a two-finger gesture.
FACEPLATES
With faceplates, both the faceplate screen and each screen container have their own gestures for
manipulation. Gestures have an effect on the screen container if the faceplate screen is not a
worldview.
Information
The available reactions depend on the screen type and screen element.
AML
The list in the AML screen offers the following as Reaction for Double tap:
No reaction
Execute own function
Acknowledge alarm: for selected alarms
Execute alarm: for selected alarms
Help for executing alarm: for selected alarms
Stop/continue list: independently of alarms.
Double tapping a list entry always selects it and executes the corresponding function. Double
tapping in an area outside the list entries only triggers the alarm-independent functions, not the
alarm-specific functions.
EXTENDED TREND
The diagram window in an Extended Trend screen offers the following as Reaction for Double tap:
No reaction
Execute own function
Zoom to 100 %
Step back
The curve list in an Extended Trend screen offers the following as Reaction for Double tap:
No reaction
Edit
4.1.3 Tap
SCREEN ELEMENTS
The following are available for screen elements as Reaction for Tap:
No reaction
Execute own function
Selection
Information
The available reactions depend on the screen type and screen element.
Turn off feedback when scrolling
Solution: This Windows action can be switched off. Windows version 8.1 or higher is required for
this.
Note: The Windows feedback is thus turned off completely and also no longer works in other
applications.
6 Evaluating events
Events that are to be evaluated via the programming interface can be defined in detail for screens.
The evaluation is carried out via VSTA at screen level. Prerequisite: the Recognition property in
the Interaction node of the project properties must be set to Windows 8.
To configure the evaluation of events:
1. Go to the Programming interface group.
2. Go to the Multi-Touch events subgroup.
3. Select the desired option from the drop-down list of the Raw data event routing property:
All events: all events are evaluated
Deactivated: the evaluation is deactivated
Only selected events: only the events activated via checkboxes are evaluated
PointerUp
PointerUpdate
PointerWheel
PointerHWheel
PointerDeviceChange
PointerDeviceInRange
PointerDeviceOutOfRange
NCPointerDown
NCPointerUp
NCPointerUpdate
PointerActivate
PointerCaptureChanged
You can find details about events in the Object model section or in the Microsoft Help on MSDN
(http://msdn.microsoft.com/en-us/library/hh454903(v=vs.85).aspx).
The properties are only available if the Recognition property in the project properties for
Interaction is set to Windows 8.
The selected configuration is available in Runtime and can be edited via VSTA.
You can find details about events in the Object model section or in the Microsoft Help on MSDN
(http://msdn.microsoft.com/en-us/library/windows/desktop/hh448838(v=vs.85).aspx).
VSTA gesture detection
SCREENS
You can find the properties for VSTA gesture detection for screens in the VSTA gesture
recognition properties group of the screen. As soon as the Gesture recognition active property
has been activated, the following gestures are available for selection:
Manipulation
Exact
Translation X
Translation Y
Rails X
Rails Y
Translation inertia
Rotation
Rotation Inertia
Scaling
Scaling inertia
Cross slide
Cross slide horizontal
Cross slide select
Cross slide speed bump
Cross slide rearrange
Cross slide exact
Tap
Tap double
Secondary tap
Drag
Hold
DYNAMIC ELEMENTS
You can find the properties for VSTA gesture detection for dynamic elements in the VSTA
gesture recognition properties group of the element. As soon as the Gesture recognition active
property has been activated, the following gestures are available for selection:
Manipulation
Exact
Translation X
Translation Y
Rails X
Rails Y
Translation inertia
Rotation
Rotation Inertia
Scaling
Scaling inertia
Cross slide
Cross slide horizontal
Cross slide select
Cross slide speed bump
Cross slide rearrange
Cross slide exact
Tap
Tap double
Secondary tap
Drag
Hold
Example project for Windows 7
START PAGE
The start page displays an overview of the complete production line. Several equipment Icons are
visible at the same time. You can scroll to other equipment using gestures. Tapping an Icon switches
to the selected equipment. The following is also available in the screen:
Alarm line at the top edge: Displays the last alarm of the complete production line. You can
drag out the alarm line. This will display the whole Alarm Message List.
Login button: Makes it possible to log in different users.
Exit button: Closes the Runtime and can only be operated by users with administrator rights.
NAVIGATION
In the lower screen area, the navigation depicts the whole production line with the help of Icons in a
horizontal scroll area. In addition an energy worldview is available. It is selected via the button located
at the lower center. A machine is selected with a Tap on a visible Icon. In this project only the
equipment Filler can be selected. If you press and hold the Filler Icon long enough, a Glow effect is
displayed. The list can be scrolled using a Swipe gesture; a Tap on the scrolling list stops this again.
The scroll speed is determined via the acceleration of the Drag movement:
Slow: follows the finger
Faster: follows quickly behind
The navigation is centered on the Filler Icon when the start screen is called up.
ALARM LINE
At the top edge of the screen an alarm line is located. It displays the last alarm of the complete
production line. It can be opened to display the Alarm Message List.
Operation:
Open:
A Tap on the line opens the AML up to half of the screen.
The AML can be adjusted to an individual size via gestures.
Close:
A Tap outside of the frame closes the opened AML.
You can also move the AML up manually.
LOGIN SCREEN
The Login screen offers a gesture-based login in the style of Windows 8. Before you enter a
password, you must select a user by means of a Tap. After that you can start entering the password
for the selected user via hovering. Example patterns are engineered for the Administrator,
Maintenance and Operator users.
In addition there is a logout button which logs out the currently logged in user and opens the login
screen. The login screen is a modal dialog which dims the background.
In addition, there is the Workspace concept with freely-positionable windows, which are stored in a
Dock when they are not used. The Workspace spans several screens, between which you can switch
via Swipe gesture, Tab navigation or navigation button. In the lower area there is an activation area
for two-hand operation and a home button. At the top right edge there is an operable display for
the workspace.
DOCK
Icons can be dragged from the dock to the workspace where they are then displayed as a faceplate in
a defined base size. If a faceplate is placed on the workspace, its icon is shown as deactivated. If a
Faceplate is closed, its icon is displayed as activated again. Tap & hold on a deactivated icon pinpoints
an open faceplate and jumps to the Workspace used by it.
WORKSPACE
On any of the four freely-definable workspaces, as many faceplates as desired can be positioned and
scaled for each person.
Move faceplate to the vertical screen edge: After a delay of 2 seconds, a change to the next
Workspace is carried out and the faceplate can be positioned freely.
Move faceplate to the vertical screen edge (swipe gesture): The faceplate is moved to the
next workspace; the current workspace remains open.
Remove/close Faceplate: Drag the faceplate to the dock or use a downwards swipe gesture.
As an option, each Faceplate can be closed via the X button, which is located at the top right
corner.
The called up Faceplates, their position, size, etc. are saved in the user profile. A faceplate can be
enlarged or reduced by means of a zoom/pinch gesture. Each faceplate can also be moved. A
selected faceplate is moved to the foreground via Z-Order manipulation but always remains behind
the alarm line. On the next login, the position and size data of the individual faceplates are read and
they are positioned accordingly.
TWO-HAND OPERATION
In the bottom left corner there is an activation area for two-hand operation. If a locked element is
actuated, the activation area flashes and a locked element can be unlocked via this area.
Input set value: the keyboard is called up and a set value can be entered.
Jog operation
A consideration of the activation order (activation before action) is engineered in the demo project.
During jog operation (faceplate operation), the active activation is constantly checked.
CIRCLE MENU
The circle menu was implemented for faster navigation between workspaces. It is activated using
Tap&Hold anywhere on the Workspace and is displayed around the finger touching the screen. The
workspace is selected by dragging the finger into one of the areas. The switch is carried out when
the finger leaves the screen. The action can be canceled by dragging outside or inside the menu area.
HOME BUTTON
There is a Home button in the bottom right corner, with which you can switch to the start screen.
The Home button can only be activated with two-hand operation.
CLASS DESCRIPTION
MULTITOUCHMANAGEMENT
Complete handling of the whole Multi-Touch application. When the MultitouchManagement class is
created, the classes LoginWindow, NavigationsWindow and WindowManagement are instantiated.
LOGINWINDOW
This class contains the important components of the user login and the password pattern
recognition.
NAVIGATIONWINDOW
Handles the "Icon" faceplate positioning screen and manages the whole opening process of the
Faceplates which are called up.
WINDOWMANAGEMENT
Is responsible for processing all touch events of all faceplates (move, scale, etc.). In addition, this class
takes care of saving and reading all faceplate information needed in Runtime.
WINDOWPROPERTIES
For each Faceplate, an independent instance is created, which provides all necessary data of the
Faceplate. For data retention, all instances are saved in an XML file when Runtime is closed
and can therefore provide the last valid settings of the faceplates when Runtime is started.
EVENTS
If you activate the project setting Multi-Touch active, the events TouchManipulationStartEvent,
TouchManipulationDeltaEvent and TouchManipulationCompleteEvent are available for corresponding event handler
declarations in the Runtime. Via the SetupTouchInertia method, you can define the inertia parameters for
each screen. The following display shows a schematic process of the fired events:
The IContacts parameter that is passed returns the number of fingers currently on the screen.
After the touch gesture is finished, the inertia processor calculates the inertia values from the
values passed via the SetupTouchInertia method. The gesture is then completed via TouchManipulationDeltaEvents
with the calculated inertia values and a single concluding TouchManipulationCompleteEvent.
As no finger is on the screen during the calculation of the inertia values, the IContacts parameter
that is passed is 0. The number of TouchManipulationDeltaEvents needed by the inertia processor depends
on the parameters passed via the SetupTouchInertia method. Depending on the inertia, the inertia
processor needs more or fewer events to finish the gesture.
If another gesture is started while the calculated inertia events are being fired, no further
TouchManipulationDeltaEvents come from the old gesture. After a concluding TouchManipulationCompleteEvent,
the events for the new gesture are started immediately with a TouchManipulationStartEvent.
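The event sequence described above can be sketched as VSTA (C#) handlers. This is only an illustrative sketch: the event argument type (here a hypothetical TouchEventArgs) and the handler signatures are assumptions, since the exact zenon VSTA signatures are not shown in this manual; only the event names and the IContacts semantics come from the text above.

```csharp
// Illustrative sketch of handlers for the documented Multi-Touch events.
// The argument type "TouchEventArgs" and its members are hypothetical;
// only the event names and the IContacts behavior (finger count, 0 during
// the inertia phase) are taken from this manual.
public void OnTouchManipulationStart(object sender, TouchEventArgs e)
{
    // A new gesture has started; fingers are on the screen.
}

public void OnTouchManipulationDelta(object sender, TouchEventArgs e)
{
    if (e.IContacts == 0)
    {
        // Inertia phase: fired by the inertia processor after the fingers
        // have been lifted, based on the SetupTouchInertia parameters.
    }
    else
    {
        // Active manipulation with e.IContacts fingers on the screen.
    }
}

public void OnTouchManipulationComplete(object sender, TouchEventArgs e)
{
    // Fired once when the gesture, including inertia, is finished.
}
```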
gestures in the engineered worldview. For more information see chapter: Navigation with
Multi-Touch in the worldview (on page 26).
8.4 FAQs
Frequently asked questions and practical answers.
Note: The properties Name for object list and Help chapter can be used in zenon as freely
definable properties.
To call up the respective screens, you must also engineer a screen switch function. The name of the
screen switch function consists of the prefix "scr" and the name of the Faceplate. If this naming
convention is observed, the complete handling is managed by the Multi-Touch Management.
During engineering, the interlocking must be linked with enableArea for two-hand operable elements.
This ensures that feedback is automatically generated if the Enabler (two-hand operation) is not
pressed. The element is only operable while the Enabler is pressed.
Group tab_inactive:
The Name for object list property must contain an entry according to the following pattern:
demoCurrentWorkspace|PAGE (for example: demoCurrentWorkspace|3)
During the input of the password pattern, a typographic password is created in the background, which
can be compared with the engineered project users via a function. The first square is interpreted as
'A', the second as 'B', etc., and the characters are put together into a coherent password via
line-dependent hovering of the squares.
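The mapping described above can be sketched in VSTA (C#) code. The method name and the representation of the hovered squares as an index sequence are assumptions; only the letter mapping (first square = 'A', second = 'B', and so on) comes from this manual.

```csharp
// Sketch of the documented pattern-to-password mapping: each hovered
// square contributes one letter, with square 0 -> 'A', 1 -> 'B', etc.
// The method name and the input representation are assumptions.
static string PatternToPassword(int[] hoveredSquareIndices)
{
    var password = new System.Text.StringBuilder();
    foreach (int index in hoveredSquareIndices)
    {
        password.Append((char)('A' + index));
    }
    return password.ToString();
}
```

Under these assumptions, hovering the first, third and second squares in that order (indices 0, 2, 1) would produce the password "ACB".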
In the circle menu, you can switch to the desired workspace by dragging your finger.
Navigation with Multi-Touch in the worldview
RULES
Move: If a screen in a container is not a worldview, it adopts the settings of the faceplate
screen.
MOVE THE FRAME OR BORDER WITH THE MOUSE IF THE SCREEN IS A WORLDVIEW AND
THE SAME SIZE OR SMALLER THAN THE FRAME:
With the right mouse button: No reaction.
With the left mouse button: Frame is moved.
PROJECT CONVERSION
Values for Move horizontally and Move vertically when converting from an earlier version to
zenon 7.20:
Screen is bigger than the frame: Move.
Screen is the same size or smaller than the frame: No reaction.
ZOOM AND SCROLL VIA PROPERTY MULTI-TOUCH FOR ZOOM AND SCROLL
To use Multi-Touch without VBA/VSTA:
1. In the project properties, in the Interaction node, set the Recognition property to Windows 7.
2. Deactivate the Use screen size from frame property in the Frame node of the screen properties.
3. Activate the Multi-Touch for zoom and scroll property in the Interaction node of the screen properties.
With this, you can scroll and zoom in the screen with touch operation using Multi-Touch gestures.
VBA/VSTA for zooming and scrolling is thereby deactivated.
SetZoomAndPos(float ZoomX, float ZoomY, int ZoomLevel, int CursorX, int CursorY, int PosX, int PosY, int PosMode)
ZoomX -> New zoom factor in X direction; if not used, set to 0
Attention: ZoomX, ZoomY and ZoomLevel can never be used simultaneously. Either you enter a ZoomLevel or a
zoom factor for the x and y axes.
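A hedged usage sketch: the object exposing SetZoomAndPos ("screen" here) and the meaning of the parameters other than ZoomX are assumptions based on the signature above.

```csharp
// Zoom to factor 2.0 on both axes via explicit zoom factors.
// ZoomLevel is set to 0 because, as noted above, ZoomX/ZoomY and
// ZoomLevel can never be used simultaneously. The receiver object
// "screen" and the remaining parameter values are assumptions.
screen.SetZoomAndPos(2.0f, 2.0f, 0, 0, 0, 0, 0, 0);
```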
For the move gesture, you can define the direction - horizontal, vertical or both. To do this, use the
Move horizontally and Move vertically properties.