
Application of Immersive Virtual Environment Through Virtual Avatar Based On Rigid-body Tracking

  • MyeongSeok Park (Department of Computer Engineering, Graduate School, Hansung University) ;
  • Jinmo Kim (Department of Computer Engineering, Graduate School, Hansung University)
  • Received : 2023.06.16
  • Accepted : 2023.07.05
  • Published : 2023.07.25

Abstract

This study proposes a rigid-body-tracking-based virtual avatar application method to increase the social presence of virtual reality (VR) users and provide them with diverse experiences in an immersive virtual environment. The proposed method estimates the motion of a virtual avatar through inverse kinematics driven by real-time, marker-based motion capture of rigid bodies. The goal is to design a highly immersive virtual environment from simple object manipulation in the real world. To test and analyze applications of immersive virtual environments through virtual avatars, science-experiment educational content was produced, and traditional audiovisual education, full-body tracking, and the proposed rigid-body tracking method were compared through a survey. In the proposed virtual environment, participants wore VR HMDs and completed a questionnaire assessing the immersion and educational effect conveyed by virtual avatars performing the experimental teaching actions reproduced from the estimated motions. As a result, the rigid-body-tracking-based virtual avatar induced higher immersion and a better educational effect than traditional audiovisual education, and it was confirmed that a sufficiently positive experience can be provided without the extensive setup required for full-body tracking.
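As a rough sketch of the estimation step described above, the following minimal example shows how an analytic two-bone inverse-kinematics solve can turn the tracked position of a single hand rigid body into an elbow flexion angle and a reach direction. The function name, bone lengths, and coordinates are illustrative assumptions for this sketch only, not the authors' implementation, which drives a full humanoid avatar from marker-based rigid-body tracking in real time.

    import numpy as np

    def two_bone_ik(shoulder, target, upper_len=0.30, fore_len=0.28):
        # Analytic two-bone IK: map a tracked hand rigid-body position to an
        # elbow flexion angle and a shoulder-to-target reach direction.
        # Bone lengths (meters) are illustrative assumptions.
        to_target = np.asarray(target, dtype=float) - np.asarray(shoulder, dtype=float)
        raw_dist = float(np.linalg.norm(to_target))
        direction = to_target / max(raw_dist, 1e-6)
        # Clamp the distance so the target stays reachable by the two-bone chain.
        dist = min(max(raw_dist, 1e-6), upper_len + fore_len - 1e-6)
        # Law of cosines: interior elbow angle of the triangle formed by the
        # upper arm, the forearm, and the shoulder-to-target segment.
        cos_elbow = (upper_len**2 + fore_len**2 - dist**2) / (2.0 * upper_len * fore_len)
        elbow_angle = float(np.arccos(np.clip(cos_elbow, -1.0, 1.0)))
        return elbow_angle, direction

    # Example: the tracker reports the hand rigid body 40 cm in front of the shoulder.
    angle, direction = two_bone_ik(shoulder=(0.0, 1.4, 0.0), target=(0.0, 1.4, 0.4))
    print(f"elbow flexion: {np.degrees(angle):.1f} deg, reach direction: {direction}")

In a full avatar setup, the same kind of solve would typically be applied per limb each frame, with the resulting joint rotations blended onto the character rig.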



Acknowledgement

This research was financially supported by Hansung University (Jinmo Kim).
