UsiGesture

An approach that allows designers, developers, and gesture specialists to create user interfaces including pen-based gesture recognition.

Creating user interfaces that integrate pen-based gesture recognition is difficult and raises many challenges: designing the graphical user interface, selecting the appropriate recognition mechanism (algorithms and datasets), incorporating this mechanism into the user interface, collaborating within a heterogeneous team, following the right workflow, and so on.

UsiGesture aims to contribute to the field of Engineering of Interactive Systems by supporting engineers, programmers, and designers in the development of graphical user interfaces that integrate pen-based gesture recognition on 2D surfaces. It offers methodological guidance for incorporating pen-based gestures into graphical user interfaces through a structured method consisting of three elements: a Concrete User Interface model supporting gestures, a step-wise approach based on this model, and a supporting software environment.
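
As an illustration of the kind of binding such a Concrete User Interface model has to express, here is a minimal sketch in Python, under the assumption that a concrete UI element carries a reference to a recognition mechanism and its gesture vocabulary. All names (Gesture, RecognizerBinding, ConcreteUIElement) are hypothetical and chosen for illustration only; they are not the actual UsiGesture or UsiXML vocabulary.

    from dataclasses import dataclass, field

    @dataclass
    class Gesture:
        # A named pen gesture, e.g. "delete" drawn as a cross-out stroke.
        name: str
        sample_count: int = 0  # number of training samples in the dataset

    @dataclass
    class RecognizerBinding:
        # Links a recognition algorithm and its training dataset to a UI element.
        algorithm: str   # illustrative algorithm identifier
        dataset: str     # identifier of the gesture dataset used for training
        gestures: list[Gesture] = field(default_factory=list)

    @dataclass
    class ConcreteUIElement:
        # A concrete UI element (canvas, text field, ...) that accepts pen input.
        element_id: str
        element_type: str
        recognizer: RecognizerBinding | None = None  # None = no gesture support

    # Example: a drawing canvas on which "delete" and "undo" gestures are recognized.
    canvas = ConcreteUIElement(
        element_id="sketchCanvas",
        element_type="canvas",
        recognizer=RecognizerBinding(
            algorithm="one-dollar",
            dataset="editing-gestures-v1",
            gestures=[Gesture("delete"), Gesture("undo")],
        ),
    )

The point of such a model is that gesture support becomes an explicit, inspectable property of the user interface description rather than ad hoc code scattered through event handlers.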

More specifically, UsiGesture supports the selection of a gesture recognition mechanism through a first tool that lets users build various datasets and run comparative studies to determine which algorithm performs best in a given application context. UsiGesture also proposes a meta-model describing gestures and recognition mechanisms, directly linked to the UsiXML concrete meta-model. Based on this meta-model, a second tool lets developers compose user interfaces with recognition components in a WYSIWYG fashion, and a process built around this tool fosters an effective workflow and collaboration among the different stakeholders involved. Finally, UsiGesture reports the lessons learned from several experiments in which user interfaces integrating pen-based gesture recognition were created with this approach.
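
To make the comparative-study idea concrete, the following is a minimal sketch in Python, not the actual UsiGesture API: the Recognizer interface and the compare harness are illustrative assumptions. The harness trains each candidate algorithm on each dataset and reports recognition accuracy, which is the kind of evidence used to pick a mechanism for a given application context; concrete algorithms such as Rubine's classifier or the $1 recognizer would plug in behind the interface.

    from typing import Protocol, Sequence

    # A pen stroke is a sequence of (x, y) points.
    Stroke = list[tuple[float, float]]

    class Recognizer(Protocol):
        # Minimal interface a candidate recognition algorithm must expose.
        def train(self, strokes: Sequence[Stroke], labels: Sequence[str]) -> None: ...
        def classify(self, stroke: Stroke) -> str: ...

    def accuracy(recognizer: Recognizer,
                 test_strokes: Sequence[Stroke],
                 test_labels: Sequence[str]) -> float:
        # Fraction of test gestures the recognizer labels correctly.
        hits = sum(recognizer.classify(s) == label
                   for s, label in zip(test_strokes, test_labels))
        return hits / len(test_labels)

    def compare(algorithms: dict[str, Recognizer],
                datasets: dict[str, tuple]) -> dict[tuple[str, str], float]:
        # Train every algorithm on every dataset and collect test accuracy,
        # so the best-performing mechanism can be selected per context.
        results = {}
        for ds_name, (train_x, train_y, test_x, test_y) in datasets.items():
            for algo_name, recognizer in algorithms.items():
                recognizer.train(train_x, train_y)
                results[(algo_name, ds_name)] = accuracy(recognizer, test_x, test_y)
        return results

The output is a table of accuracy scores indexed by (algorithm, dataset), which directly supports the comparative studies the first tool is meant to enable.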
