Mixed Reality Extension System Using Beam Projectors : Beyond the Sight

  • Jongyong Kim (Department of Multimedia, Graduate School of Digital Image and Contents, Dongguk University) ;
  • Jonghoon Song (Department of Multimedia, Graduate School of Digital Image and Contents, Dongguk University) ;
  • Jeongho Park (Department of Multimedia Engineering, Dongguk University) ;
  • Jaeseung Nam (Department of Multimedia Engineering, Dongguk University) ;
  • Seung-Hyun Yoon (Department of Multimedia Engineering, Dongguk University) ;
  • Sanghun Park (Department of Multimedia, Graduate School of Digital Image and Contents, Dongguk University)
  • Received : 2019.06.08
  • Accepted : 2019.06.21
  • Published : 2019.07.14

Abstract

Recently, commercial mixed-reality devices have been launched and a variety of mixed-reality content has been produced, but the narrow field of view, a hardware limitation of these devices, is frequently cited as a key factor that hinders immersion and restricts their range of use. We propose a new system that coordinates multiple beam projectors with a number of mixed-reality devices. Using this technology, users can maximize immersion and minimize the frustration of narrow viewing angles by rendering 3D objects against the background of large 2D screens. The system, named BtS (Beyond the Sight), is implemented on a client-server basis and includes, as its core modules, inter-device calibration, spatial coordinate system sharing, and real-time rendering synchronization. In this paper, we describe each module in detail and demonstrate the system's performance and applicability through a mixed-reality content case produced with BtS.
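The abstract names three core modules: inter-device calibration, spatial coordinate system sharing, and real-time rendering synchronization. The following is a minimal Python sketch of how these pieces could fit together in a client-server loop. All class and method names (`BtsServer`, `BtsClient`, `broadcast_pose`, and so on) are illustrative assumptions, not the authors' actual BtS API, which is built in Unity.

```python
# Minimal sketch (assumed names, not the authors' actual API) of the three
# BtS core ideas: a shared world coordinate system, per-device calibration,
# and server-driven render synchronization across projectors and headsets.

import numpy as np

def translation(tx, ty, tz):
    """4x4 homogeneous translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [tx, ty, tz]
    return m

class BtsServer:
    """Owns the shared world frame and pushes per-frame object poses."""
    def __init__(self):
        self.clients = []

    def register(self, client):
        self.clients.append(client)

    def broadcast_pose(self, pose_world):
        # Render synchronization: every client receives the same world-space
        # pose for the current frame and converts it into its own frame.
        for client in self.clients:
            client.on_pose(pose_world)

class BtsClient:
    """A projector or MR headset; calibration maps world -> device frame."""
    def __init__(self, name, world_to_device):
        self.name = name
        self.world_to_device = world_to_device  # from offline calibration
        self.last_local_pose = None

    def on_pose(self, pose_world):
        # Shared coordinate system: all devices agree on world coordinates;
        # each applies its own calibration to get a local rendering pose.
        self.last_local_pose = self.world_to_device @ pose_world

# Usage: one projector at the world origin and one headset calibrated
# 1 m away along the x-axis; both receive the same object pose.
server = BtsServer()
projector = BtsClient("projector", translation(0, 0, 0))
headset = BtsClient("headset", translation(-1, 0, 0))
server.register(projector)
server.register(headset)
server.broadcast_pose(translation(2, 0, 0))  # object at x = 2 in world space
```

In the real system the transport would be a network channel between the Unity server and each HoloLens or projector client; the in-memory callback above merely stands in for that message channel.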

References

  1. P. Milgram and F. Kishino, "A taxonomy of mixed reality visual displays," IEICE Transactions on Information and Systems, vol. E77-D, no. 12, pp. 1321-1329, 1994.
  2. Y. Ohta and H. Tamura, Mixed reality: merging real and virtual worlds. Springer Publishing Company, Incorporated, 2014.
  3. H. Chen, A. S. Lee, M. Swift, and J. C. Tang, "3D collaboration method over HoloLens™ and Skype™ end points," in Proceedings of the 3rd International Workshop on Immersive Media Experiences, ser. ImmersiveME '15. New York, NY, USA: ACM, 2015, pp. 27-30. [Online]. Available: http://doi.acm.org/10.1145/2814347.2814350
  4. Y.-T. Yue, Y.-L. Yang, G. Ren, and W. Wang, "SceneCtrl: Mixed reality enhancement via efficient scene editing," in Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology, ser. UIST '17. New York, NY, USA: ACM, 2017, pp. 427-436. [Online]. Available: http://doi.acm.org/10.1145/3126594.3126601
  5. MagicLeap, https://www.magicleap.com/, 2019.
  6. Microsoft, https://www.microsoft.com/en-us/hololens, 2019.
  7. Unity Technologies, https://unity.com/kr, 2019.
  8. B. R. Jones, H. Benko, E. Ofek, and A. D. Wilson, "Illumiroom: Peripheral projected illusions for interactive experiences," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ser. CHI '13. New York, NY, USA: ACM, 2013, pp. 869-878. [Online]. Available: http://doi.acm.org/10.1145/2470654.2466112
  9. T. Okatani and K. Deguchi, "Easy calibration of a multiprojector display system," International journal of computer vision, vol. 85, no. 1, pp. 1-18, 2009. https://doi.org/10.1007/s11263-009-0242-0
  10. J. Zhou, L. Wang, A. Akbarzadeh, and R. Yang, "Multi-projector display with continuous self-calibration," in Proceedings of the 5th ACM/IEEE International Workshop on Projector Camera Systems, ser. PROCAMS '08. New York, NY, USA: ACM, 2008, pp. 3:1-3:7. [Online]. Available: http://doi.acm.org/10.1145/1394622.1394626
  11. H. S. Oluwatosin, "Client-server model," IOSRJ Comput. Eng, vol. 16, no. 1, pp. 2278-8727, 2014.
  12. M. Bailey and S. Cunningham, Graphics shaders: theory and practice. AK Peters/CRC Press, 2016.
  13. A. Zucconi and K. Lammers, Unity 5.x Shaders and Effects Cookbook. Packt Publishing Ltd, 2016.
  14. Microsoft, https://github.com/microsoft/MixedRealityToolkit-Unity, 2019.
  15. T. Frantz, B. Jansen, J. Duerinck, and J. Vandemeulebroucke, "Augmenting microsoft's hololens with vuforia tracking for neuronavigation," Healthcare technology letters, vol. 5, no. 5, pp. 221-225, 2018. https://doi.org/10.1049/htl.2018.5079
  16. HoloLens, https://docs.microsoft.com/ko-kr/windows/mixed-reality/spectator-view, 2019.
  17. S. Willi and A. Grundhöfer, "Robust geometric self-calibration of generic multi-projector camera systems," in 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 2017, pp. 42-51.
  18. B. Jones, R. Sodhi, M. Murdock, R. Mehra, H. Benko, A. Wilson, E. Ofek, B. MacIntyre, N. Raghuvanshi, and L. Shapira, "Roomalive: Magical experiences enabled by scalable, adaptive projector-camera units," in Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology, ser. UIST '14. New York, NY, USA: ACM, 2014, pp. 637-644. [Online]. Available: http://doi.acm.org/10.1145/2642918.2647383