
Microsoft Kinect-based Indoor Building Information Model Acquisition

Acquisition of an Indoor Building Information Model (BIM) using Kinect (RGB-Depth Camera)

  • Kim, Junhee (Department of Architectural Engineering, Dankook University) ;
  • Yoo, Sae-Woung (Department of Architectural Engineering, Dankook University) ;
  • Min, Kyung-Won (Department of Architectural Engineering, Dankook University)
  • Received : 2018.06.11
  • Accepted : 2018.07.23
  • Published : 2018.08.31

Abstract

This paper investigates the applicability of the Microsoft Kinect®, an RGB-depth camera, for constructing a 3D image and spatial information of a sensed target. The relationship between the Kinect camera image and the pixel coordinate system is formulated. Calibration of the camera provides the depth and RGB information of the target. The intrinsic parameters, namely the focal length, principal point, and distortion coefficients, are obtained through a checkerboard experiment. The extrinsic parameters, which relate the two Kinect cameras, consist of a rotation matrix and a translation vector. The images in the 2D projection space are converted to 3D images, yielding spatial information based on the depth and RGB data. The measurements are verified by comparison with the lengths and locations of the target structure in its 2D drawings.
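The 2D-to-3D conversion described above can be sketched with the standard pinhole back-projection: a pixel (u, v) with a measured depth is mapped into camera-frame coordinates using the calibrated focal lengths and principal point. The intrinsic values below are hypothetical placeholders (in the ballpark of a Kinect v2 depth camera), not the paper's calibration results.

```python
import numpy as np

def pixel_to_3d(u, v, depth, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with measured depth (metres)
    into camera-frame 3D coordinates via the pinhole model."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Illustrative intrinsics (assumed values, not the paper's results):
fx, fy = 365.0, 365.0   # focal lengths in pixels
cx, cy = 256.0, 212.0   # principal point in pixels
p = pixel_to_3d(300, 250, 2.0, fx, fy, cx, cy)  # one depth pixel -> 3D point
```

Applying this mapping to every valid depth pixel in a frame produces the point cloud from which the spatial information is assembled.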

In this study, a point-cloud technique using Microsoft's Kinect® v2 was introduced to acquire indoor building spatial information. The 2D projection-space pixel coordinates of the images acquired by the camera were converted into 3D images through calibration of each camera, and spatial information was implemented on this basis. The 3D images, acquired by rotating 360° about a reference point, overcome the limitation of conventional 2D images, in which distances cannot be measured, and the point cloud obtained through this process was used to form a 3D map. The resulting 3D map achieves measurement efficiency comparable to that of existing sensors for spatial-information convergence, while post-processing of the lens distortion allows the spatial information to be measured accurately. The measured results were verified by comparison with 2D drawings and with the actual lengths and positions of the space and its structural members.
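Merging the views captured while rotating about the reference point relies on the extrinsic parameters relating the two camera poses: each point is mapped into a common frame as p' = Rp + t. A minimal sketch, with an assumed yaw angle and baseline purely for illustration (the paper's actual extrinsics are not given here):

```python
import numpy as np

# Hypothetical extrinsics relating camera frame 1 to camera frame 2:
theta = np.deg2rad(30.0)  # assumed yaw between the two poses
R = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
              [ 0.0,           1.0, 0.0          ],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([0.5, 0.0, 0.0])  # assumed 0.5 m baseline

def transform_points(points, R, t):
    """Map camera-1 points (N x 3) into camera-2 coordinates: p' = R p + t."""
    return points @ R.T + t

cloud_cam1 = np.array([[0.0, 0.0, 2.0]])     # one point, 2 m in front of camera 1
cloud_cam2 = transform_points(cloud_cam1, R, t)
```

Applying the registration to each captured cloud and concatenating the results yields the 3D map; the rotation matrix and translation vector themselves come from the extrinsic calibration step.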

Keywords
