GestUI: A Model-driven Method and Tool for Including Gesture-based Interaction in User Interfaces
Complex Systems Informatics and Modeling Quarterly 2016
Otto Parra, Sergio España, Oscar Pastor

Among the technological advances in touch-based devices, gesture-based interaction has become a prevalent feature in many application domains, and information systems are starting to explore this type of interaction. As a result, gesture specifications are now being hard-coded by developers at the source code level, which hinders their reusability and portability. Similarly, defining new gestures that reflect user requirements is a complex process. This paper describes a model-driven approach to including gesture-based interaction in desktop information systems. It incorporates a tool prototype that captures user-sketched multi-stroke gestures and transforms them into a model, automatically generating the gesture catalogue for gesture-based interaction technologies and the source code of gesture-based user interfaces. We demonstrate our approach in several applications, ranging from CASE tools to form-based information systems.


Keywords
Model-driven architecture; gesture-based interaction; multi-stroke gestures; information systems; gesture-based user interface
DOI
10.7250/csimq.2016-6.05
Hyperlink
https://csimq-journals.rtu.lv/article/view/csimq.2016-6.05

Parra, O., España, S., Pastor, O. GestUI: A Model-driven Method and Tool for Including Gesture-based Interaction in User Interfaces. Complex Systems Informatics and Modeling Quarterly, 2016, No. 6, pp. 73-92. e-ISSN 2255-9922. Available from: doi:10.7250/csimq.2016-6.05

Publication language
English (en)