The Development of a Haptic Interface for Interacting with BIM Elements in Mixed Reality

  • Cho, Jaehong (Department of Civil and Environmental Engineering, Incheon National University) ;
  • Kim, Sehun (R&D Business Dept., DoIT Co., Ltd.) ;
  • Kim, Namyoung (R&D Business Dept., DoIT Co., Ltd.) ;
  • Kim, Sungpyo (R&D Business Dept., DoIT Co., Ltd.) ;
  • Park, Chaehyeon (Department of Civil and Environmental Engineering, Incheon National University) ;
  • Lim, Jiseon (Department of Civil and Environmental Engineering, Incheon National University) ;
  • Kang, Sanghyeok (Department of Civil and Environmental Engineering, Incheon National University)
  • Published: 2022.06.20

Abstract

Building Information Modeling (BIM) is widely used to efficiently share, utilize, and manage the information generated in every phase of a construction project. Recently, mixed reality (MR) technologies have been introduced to utilize BIM elements more effectively. This study deals with haptic interaction between humans and BIM elements in MR to improve BIM usability. As a first step toward interacting with virtual objects in mixed reality, we tackled the task of moving a virtual object to a desired location using finger-pointing. This paper presents the development of a haptic interface system with which users can interact with a BIM object and move it to a desired location in MR. The system consists of an MR-based head-mounted display (HMD) and a mobile application developed with Unity 3D. The study defines two segments used to compute the scale factor and rotation angle of the virtual object to be moved. In a test with a cuboid, users were able to move the object successfully to the desired location. The developed MR-based haptic interface can be used to align BIM elements overlaid on the real world at a construction site.
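The abstract states that two segments are defined to compute the scale factor and rotation angle of the object to be moved. A minimal sketch of that computation is given below; the 2D simplification, the segment representation as endpoint pairs, and the function name `segment_transform` are illustrative assumptions, since the paper's exact segment definitions are not given in the abstract.

```python
import math

def segment_transform(seg_a, seg_b):
    """Return (scale, angle) mapping segment seg_a onto segment seg_b.

    Each segment is a pair of 2D points ((x0, y0), (x1, y1)).
    The scale factor is the ratio of segment lengths; the rotation
    angle (radians) is the signed angle between the two segments.
    Hypothetical sketch -- not the paper's actual implementation.
    """
    (ax0, ay0), (ax1, ay1) = seg_a
    (bx0, by0), (bx1, by1) = seg_b
    # Direction vectors of the two segments
    va = (ax1 - ax0, ay1 - ay0)
    vb = (bx1 - bx0, by1 - by0)
    # Scale factor: ratio of segment lengths
    scale = math.hypot(*vb) / math.hypot(*va)
    # Rotation angle: difference of the segments' orientations
    angle = math.atan2(vb[1], vb[0]) - math.atan2(va[1], va[0])
    return scale, angle

# Example: a unit segment along x mapped onto a length-2 segment along y
s, a = segment_transform(((0, 0), (1, 0)), ((0, 0), (0, 2)))
```

In an MR pipeline these two values would then be applied as the uniform scale and rotation of the virtual object's transform before placing it at the pointed-to location.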

Acknowledgement

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korean government (Ministry of Science and ICT) (No. 2021R1A2C2014488).