• Title/Summary/Keyword: Gesture interface markup language


A Gesture Interface Description Language for a Unified Gesture Platform

  • Geun-Hyung Kim;EunJi Song
    • Asia-pacific Journal of Convergent Research Interchange / v.4 no.2 / pp.1-12 / 2018
  • Nowadays, the advent of smart devices equipped with the latest input technologies has changed the way users interact with them. The gesture-based user interface, as a natural user interface technology, has attracted considerable attention from researchers and developers. Gestures can be performed in different ways: touching a screen, moving a pointing device, or making hand or body movements in three-dimensional (3D) space. The variety of gesture input devices forces application developers to maintain multiple source code families for the same application in order to support different gesture input devices. In this paper, we define the gesture interface markup language (GIML), based on the extensible markup language (XML), to describe gestures independently of the input devices. It also provides the constraints necessary to determine which gesture has occurred and the information required when the UGesture platform interacts with a gesture-based application. The proposed GIML is based on our previously implemented UGesture platform and its evaluation results, so GIML can be used to define new gestures for the UGesture platform and to support new input hardware.
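The abstract does not reproduce the published GIML schema, but a device-independent XML gesture description in the spirit it describes might look like the following hypothetical sketch (all element and attribute names here are assumptions for illustration, not the paper's actual schema):

```xml
<!-- Hypothetical sketch only; illustrative names, not the published GIML schema -->
<gesture name="swipe-left">
  <!-- Constraints used to decide that this gesture has occurred -->
  <constraints>
    <direction axis="x" sign="negative"/>
    <minDistance unit="px">120</minDistance>
    <maxDuration unit="ms">500</maxDuration>
  </constraints>
  <!-- Abstract device classes the gesture can be recognized on,
       so one description serves several input devices -->
  <devices>
    <device type="touchscreen"/>
    <device type="depth-camera"/>
  </devices>
  <!-- Event the platform delivers to the gesture-based application -->
  <event name="onSwipeLeft"/>
</gesture>
```

The point of such a description is that recognition constraints and the delivered event are stated once, while per-device handling stays inside the platform.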

Design of Gesture based Interfaces for Controlling GUI Applications (GUI 어플리케이션 제어를 위한 제스처 인터페이스 모델 설계)

  • Park, Ki-Chang;Seo, Seong-Chae;Jeong, Seung-Moon;Kang, Im-Cheol;Kim, Byung-Gi
    • The Journal of the Korea Contents Association / v.13 no.1 / pp.55-63 / 2013
  • NUI (Natural User Interfaces) evolved from CLI (Command Line Interfaces) and GUI (Graphical User Interfaces). NUI uses many different input modalities, including multi-touch, motion tracking, voice, and stylus. To adopt NUI in a legacy GUI application, developers must add device libraries, modify the relevant source code, and debug it. In this paper, we propose a gesture-based interface model that can be applied without modifying existing event-based GUI applications, and we also present an XML schema for specifying the proposed model. The paper demonstrates the use of the proposed model through a prototype.
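The key idea is to translate recognized gestures into the ordinary GUI events a legacy application already handles, so its source code stays untouched. A hypothetical mapping document of the kind such an XML schema might validate (element names are illustrative assumptions, not the paper's schema):

```xml
<!-- Hypothetical illustration; the paper's actual schema is not shown in the abstract -->
<gestureInterface application="MediaPlayer">
  <mapping>
    <gesture name="swipe-right"/>
    <!-- Injected as a standard GUI event, so the legacy
         event-based application needs no modification -->
    <guiEvent type="keyPress" key="RightArrow"/>
  </mapping>
  <mapping>
    <gesture name="palm-push"/>
    <guiEvent type="mouseClick" button="left"/>
  </mapping>
</gestureInterface>
```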

W3C based Interoperable Multimodal Communicator (W3C 기반 상호연동 가능한 멀티모달 커뮤니케이터)

  • Park, Daemin;Gwon, Daehyeok;Choi, Jinhuyck;Lee, Injae;Choi, Haechul
    • Journal of Broadcast Engineering / v.20 no.1 / pp.140-152 / 2015
  • HCI (Human-Computer Interaction) enables interaction between people and computers through human-familiar interfaces called modalities. Recently, to provide an optimal interface for various devices and service environments, advanced HCI methods using multiple modalities have been intensively studied. However, multimodal interfaces face the difficulty that modalities have different data formats and are hard to coordinate efficiently. To solve this problem, a multimodal communicator is introduced, based on EMMA (Extensible MultiModal Annotation markup language) and the MMI (Multimodal Interaction) framework of the W3C (World Wide Web Consortium) standards. This standards-based framework, consisting of modality components, an interaction manager, and a presentation component, makes multiple modalities interoperable and provides wide expansion capability for further modalities. Experimental results show the multimodal communicator operating with the two modalities of eye tracking and gesture recognition in a map-browsing scenario.
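EMMA gives each modality component a common envelope for reporting recognition results to the interaction manager. A minimal document in the style of the W3C EMMA 1.0 Recommendation, annotating one gesture interpretation (the `<command>` payload is application-defined and purely illustrative):

```xml
<emma:emma version="1.0"
    xmlns:emma="http://www.w3.org/2003/04/emma">
  <!-- One interpretation produced by a gesture-recognition modality component -->
  <emma:interpretation id="int1"
      emma:medium="visual"
      emma:mode="gesture"
      emma:confidence="0.87">
    <!-- Application-specific payload (illustrative, not defined by EMMA) -->
    <command action="zoom-in" target="map"/>
  </emma:interpretation>
</emma:emma>
```

Because every modality wraps its results in the same annotation vocabulary (medium, mode, confidence, timestamps), the interaction manager can combine, say, an eye-tracking fixation and a gesture without caring about each recognizer's native data format.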