• Title/Summary/Keyword: Mobile haptic interface

Practical Issues of Mobile Haptic Interface and Their Improvements (이동형 햅틱 장치의 실제적 문제점과 그 향상 방안)

  • Lee, In;Hwang, In-Wook;Han, Kyung-Lyoung;Choi, Oh-Kyu;Lee, Jin S.;Choi, Seung-Moon
    • Proceedings of the HCI Society of Korea Conference / 2009.02a / pp.390-395 / 2009
  • In this paper, we present practical issues in a Mobile Haptic Interface (MHI) and their improvements. The improvements can be categorized into three parts: 1) high-accuracy estimation of the world position of the haptic interface point, 2) a motion planning algorithm that moves the mobile base while avoiding collisions with the user and other objects, and 3) closed-loop force control to compensate for the undesired effect of mobile base dynamics on the final rendering force perceived by the user.
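The third improvement, closed-loop force control, can be illustrated with a minimal sketch. The PI structure, gains, and sample time below are illustrative assumptions, not the authors' implementation: the force actually transmitted to the user is measured and driven toward the desired rendering force, so that a disturbance from the mobile base's dynamics is cancelled.

```python
def force_compensation_step(f_desired, f_measured, integ, kp=0.8, ki=5.0, dt=0.001):
    """One step of an illustrative closed-loop force compensator (PI form).

    f_desired  : rendering force the user should feel (N)
    f_measured : force actually measured at the end effector (N)
    integ      : accumulated integral of the force error
    Returns the commanded force and the updated integral state.
    All gains are made-up values for illustration only.
    """
    error = f_desired - f_measured      # disturbance from base dynamics shows up here
    integ += error * dt                 # integral action removes steady-state error
    f_command = f_desired + kp * error + ki * integ
    return f_command, integ
```

With a constant disturbance (e.g. the base decelerating), the integral term slowly builds up the correction and the felt force converges to the desired one.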

Mobile Haptic Interface for Large Immersive Virtual Environments: PoMHI v0.5 (대형 가상환경을 위한 이동형 햅틱 인터페이스: PoMHI v0.5)

  • Lee, Chae-Hyun;Hong, Min-Sik;Lee, In;Choi, Oh-Kyu;Han, Kyung-Lyong;Kim, Yoo-Yeon;Choi, Seung-Moon;Lee, Jin-Soo
    • The Journal of Korea Robotics Society / v.3 no.2 / pp.137-145 / 2008
  • We present the initial results of on-going research on building a novel Mobile Haptic Interface (MHI) that can provide an unlimited haptic workspace in large immersive virtual environments. When a user explores a large virtual environment, the MHI can sense the position and orientation of the user, move itself to an appropriate configuration, and deliver force feedback, thereby enabling a virtually limitless workspace. Our MHI (PoMHI v0.5) features omnidirectional mobility, a collision-free motion planning algorithm, and force feedback for general environment models. We also provide experimental results that show the fidelity of our mobile haptic interface.

The Development of a Haptic Interface for Interacting with BIM Elements in Mixed Reality

  • Cho, Jaehong;Kim, Sehun;Kim, Namyoung;Kim, Sungpyo;Park, Chaehyeon;Lim, Jiseon;Kang, Sanghyeok
    • International Conference on Construction Engineering and Project Management / 2022.06a / pp.1179-1186 / 2022
  • Building Information Modeling (BIM) is widely used to efficiently share, utilize, and manage information generated in every phase of a construction project. Recently, mixed reality (MR) technologies have been introduced to utilize BIM elements more effectively. This study deals with haptic interactions between humans and BIM elements in MR to improve BIM usability. As the first step in interacting with virtual objects in mixed reality, we attempted to move a virtual object to a desired location using finger-pointing. This paper presents the procedure of developing a haptic interface system in which users can interact with a BIM object to move it to the desired location in MR. The interface system consists of an MR-based head-mounted display (HMD) and a mobile application developed using Unity 3D. This study defined two segments to compute the scale factor and rotation angle of the virtual object to be moved. In a test with a cuboid, users could successfully move it to the desired location. The developed MR-based haptic interface can be used for aligning BIM elements overlaid on the real world at the construction site.
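The two-segment idea can be sketched in its simplest form. The paper does not specify the computation, so the following is an assumed 2D version: given a reference segment and a target segment (each a pair of endpoints), the scale factor is the ratio of their lengths and the rotation angle is the difference of their orientations.

```python
import math

def scale_and_rotation(ref_a, ref_b, tgt_a, tgt_b):
    """Illustrative 2D computation: return the (scale, angle) pair that maps
    the reference segment ref_a->ref_b onto the target segment tgt_a->tgt_b.
    The angle is in radians, counter-clockwise positive."""
    rx, ry = ref_b[0] - ref_a[0], ref_b[1] - ref_a[1]
    tx, ty = tgt_b[0] - tgt_a[0], tgt_b[1] - tgt_a[1]
    scale = math.hypot(tx, ty) / math.hypot(rx, ry)      # length ratio
    angle = math.atan2(ty, tx) - math.atan2(ry, rx)      # orientation difference
    return scale, angle
```

For example, mapping a unit segment along the x-axis onto a length-2 segment along the y-axis yields a scale of 2 and a rotation of 90 degrees.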

Design of Ball-based Mobile Haptic Interface (볼 기반의 모바일 햅틱 인터페이스 디자인)

  • Choi, Min-Woo;Kim, Joung-Hyun
    • Proceedings of the HCI Society of Korea Conference / 2009.02a / pp.122-128 / 2009
  • In this paper, we present the design and evaluation of a hand-held, ball-based haptic interface named "TouchBall." Using a trackball mechanism, the device provides flexibility in terms of directional degrees of freedom. It also has the advantage of transferring force feedback directly through frictional touch (with high sensitivity), and thus requires only a relatively small amount of inertia. This leads to a compact hand-held design appropriate for mobile and 3D interactive applications. The device is evaluated for the detection thresholds of force-feedback direction and for the perceived amount of directional force. The refined directional information can be combined with other modalities with less sensory conflict, enriching the user experience for a given application.

Robot Mobile Control Technology Using Robot Arm as Haptic Interface (로봇의 팔을 햅틱 인터페이스로 사용하여 로봇의 이동을 제어하는 기술)

  • Jung, Yu Chul;Lee, Seongsoo
    • Journal of IKEEE / v.17 no.1 / pp.44-50 / 2013
  • This paper proposes a haptic-based robot that follows a human by using only the sensors already present in its arms, without additional sensors. The joints of the robot arms contain several motors, and the joint angles can be read from these motors when a human pushes or pulls the arms, so the arms themselves can be used as haptic sensors. The implemented robot follows a human through the interaction between the robot arms and the human's hands, much as one person leads another by the hand.
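The arm-as-sensor idea amounts to mapping joint-angle deflections to a base motion command. The mapping below is a minimal sketch under assumed conventions (joint 0 commands forward speed, joint 1 commands turning, with made-up gain and deadband values); the paper does not give its actual mapping.

```python
def arm_to_base_velocity(joint_angles, home_angles, gain=0.5, deadband=0.05):
    """Illustrative mapping from arm-joint deflections (read from the arm's
    own motor encoders while a person pushes or pulls the arm) to a mobile-base
    velocity command (linear, angular). All conventions here are assumptions."""
    def deflect(i):
        d = joint_angles[i] - home_angles[i]
        return d if abs(d) > deadband else 0.0   # ignore noise near the rest pose
    linear = gain * deflect(0)    # push/pull along the arm -> drive forward/back
    angular = gain * deflect(1)   # sideways push -> turn
    return linear, angular
```

A deadband around the rest pose keeps the base still when nobody is touching the arm, which is the behavior the abstract implies.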

Multimodal Interaction on Automultiscopic Content with Mobile Surface Haptics

  • Kim, Jin Ryong;Shin, Seunghyup;Choi, Seungho;Yoo, Yeonwoo
    • ETRI Journal / v.38 no.6 / pp.1085-1094 / 2016
  • In this work, we present interactive automultiscopic content with mobile surface haptics for multimodal interaction. Our system consists of a 40-view automultiscopic display and a tablet supporting surface haptics in an immersive room. Animated graphics are projected onto the walls of the room. The 40-view automultiscopic display is placed at the center of the front wall. The haptic tablet is installed at the mobile station to enable the user to interact with the tablet. The 40-view real-time rendering and multiplexing technology is applied by establishing virtual cameras in the convergence layout. Surface haptics rendering is synchronized with three-dimensional (3D) objects on the display for real-time haptic interaction. We conduct an experiment to evaluate user experiences of the proposed system. The results demonstrate that the system's multimodal interaction provides positive user experiences of immersion, control, user interface intuitiveness, and 3D effects.

Remote Control of a Mobile Robot Using Human Adaptive Interface (사용자 적응 인터페이스를 사용한 이동로봇의 원격제어)

  • Hwang, Chang-Soon;Lee, Sang-Ryong;Park, Keun-Young;Lee, Choon-Young
    • Journal of Institute of Control, Robotics and Systems / v.13 no.8 / pp.777-782 / 2007
  • Human-Robot Interaction (HRI) through a haptic interface plays an important role in controlling robot systems remotely. The augmented use of bio-signals in the haptic interface is an emerging research area. To take the operator's state into account in HRI, we used bio-signals such as ECG and blood pressure in our proposed force-reflection interface. The variation of the operator's state is detected by processing these bio-signals. The standard deviation of the R-R intervals and the blood pressure were used to adaptively adjust the force reflection generated from the environmental conditions. Our main idea is to change the pattern of force reflection according to the state of the human operator. A set of experiments shows promising results for our concept of a human adaptive interface.
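The R-R-interval part of the scheme can be sketched as follows. The abstract does not give the adjustment rule, so this is an assumed one: the standard deviation of R-R intervals (SDNN) is a common heart-rate-variability stress indicator, and here a low SDNN (suggesting a stressed operator) reduces the force-reflection gain. The thresholds and the direction of adjustment are illustrative assumptions.

```python
import statistics

def adaptive_force_gain(rr_intervals_ms, base_gain=1.0,
                        sdnn_calm=50.0, min_gain=0.3):
    """Illustrative adaptive force-reflection gain from ECG R-R intervals.

    rr_intervals_ms : recent R-R intervals in milliseconds (>= 2 samples)
    sdnn_calm       : assumed SDNN above which the operator counts as relaxed
    The gain is interpolated down toward min_gain as SDNN drops to zero.
    """
    sdnn = statistics.stdev(rr_intervals_ms)   # SDNN: std. dev. of R-R intervals
    if sdnn >= sdnn_calm:                      # relaxed operator: full reflection
        return base_gain
    return min_gain + (base_gain - min_gain) * (sdnn / sdnn_calm)
```

The returned gain would then multiply the environment-generated reflection force before it is displayed to the operator.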

Teleoperation of Field Mobile Manipulator with Wearable Haptic-based Multi-Modal User Interface and Its Application to Explosive Ordnance Disposal

  • Ryu Dongseok;Hwang Chang-Soon;Kang Sungchul;Kim Munsang;Song Jae-Bok
    • Journal of Mechanical Science and Technology / v.19 no.10 / pp.1864-1874 / 2005
  • This paper describes the design and implementation of a wearable multi-modal user interface for a teleoperated field robot system. Recently, teleoperated field robots have been employed in hazardous-environment applications (e.g. rescue, explosive ordnance disposal, security). To complete these missions in outdoor environments, the robot system must have appropriate functions, accuracy, and reliability. However, the more functions it has, the more difficult those functions become to operate. To cope with this problem, an effective user interface should be developed. Furthermore, the user interface needs to be wearable for portability and prompt action. This research starts from the question of how to teleoperate a complicated slave robot easily. The main challenge is to make a simple and intuitive user interface in a wearable shape and size. This research provides multiple modalities, namely visual, auditory, and haptic senses, enabling an operator to control every function of a field robot more intuitively. As a result, an EOD (explosive ordnance disposal) demonstration is conducted to verify the validity of the proposed wearable multi-modal user interface.

Design of a 6-DOF Parallel Haptic Hand Controller Consisting of 5-Bar Linkages and Gimbal Mechanisms (5절링크와 짐벌기구로 구성된 병렬형 6자유도 햅틱 핸드컨트롤러의 설계)

  • Ryu, Dong-Seok;Sohn, Won-Sun;Song, Jae-Bok
    • Transactions of the Korean Society of Mechanical Engineers A / v.27 no.1 / pp.18-25 / 2003
  • A haptic hand controller (HHC) operated by the user's hand can capture the position and orientation of the hand and display the force and moment generated in the virtual environment to the hand. In this paper, a 3-DOF hand controller is first presented, in which all actuators are mounted on the fixed base by combining a 5-bar linkage and a gimbal mechanism. The 6-DOF HHC is then designed by connecting two of these 3-DOF devices through a handle consisting of a screw and nut. An analysis using a performance index is carried out to determine the dimensions of the device. The HHC control system consists of a high-level controller for kinematic and static analysis and a low-level controller for position sensing and motor control. As a simple application, the HHC is used as a user interface to control a mobile robot in a virtual environment.

Tele-Manipulation of ROBHAZ-DT2 for Hazard Environment Applications

  • Ryu, Dong-Seok;Lee, Jong-Wha;Yoon, Seong-Sik;Kang, Sung-Chul;Song, Jae-Bok;Kim, Mun-Sang
    • Proceedings of the Institute of Control, Robotics and Systems Conference / 2003.10a / pp.2051-2056 / 2003
  • In this paper, tele-manipulation for explosive ordnance disposal (EOD) applications is discussed. The ROBHAZ-DT2 was developed as a teleoperated mobile manipulator for EOD. In general, it has been thought that the robot must have functions and accuracy sufficient to handle complicated and dangerous missions. However, research on the ROBHAZ-DT2 revealed that teleoperation itself causes further restrictions and difficulties in EOD missions. To solve this problem, a novel user interface for the ROBHAZ-DT2 was developed, through which the operator can interact using various human senses (i.e. visual, auditory, and haptic). It enables an operator to control the ROBHAZ-DT2 simply and intuitively. A tele-manipulation control scheme for the ROBHAZ-DT2 is also proposed, including compliance control via force feedback; it makes the robot adapt itself to circumstances while faithfully following the operator's commands. This paper gives a detailed description of the user interface and the tele-manipulation control for the ROBHAZ-DT2. An EOD demonstration is conducted to verify the validity of the proposed interface and control scheme.
