360-degree Video Streaming System for Large-scale Immersive Displays

  • Yeongil Ryu (Department of Computer Science Education, Sungkyunkwan University)
  • Kon Hyong Kim (Media Arts and Technology, University of California, Santa Barbara)
  • Andres Cabrera (Media Arts and Technology, University of California, Santa Barbara)
  • JoAnn Kuchera-Morin (Media Arts and Technology, University of California, Santa Barbara)
  • Sehun Jeong (Department of Computer Education, Sungkyunkwan University)
  • Eun-Seok Ryu (Department of Computer Science Education, Sungkyunkwan University)
  • Received : 2022.08.18
  • Accepted : 2022.10.11
  • Published : 2022.11.30

Abstract

This paper presents a novel 360-degree video streaming system for large-scale immersive displays, together with its ongoing implementation. Whereas recent VR services target a single viewer on a 2D display or an HMD (Head-Mounted Display), the proposed system allows multiple viewers to explore immersive content together on a large-scale immersive display. The system is being developed in three research phases, with the final goal of supporting 6DoF (Degrees of Freedom) viewing; Phase 1, a prototype 3DoF 360-degree video streaming system for large-scale immersive displays, has been completed. The prototype employs subpicture-based viewport-dependent streaming and, compared with conventional viewport-independent streaming, achieves a bit-rate reduction of about 80% and a video decoding speed-up of about 543%. Beyond implementation and performance evaluation, a trial broadcast was carried out in the AlloSphere, a large-scale instrument for immersive media art at UCSB (University of California, Santa Barbara), laying the groundwork for the Phase 2 and 3 research stages.
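The savings reported above come from the viewport-dependent design: only the subpictures that cover the viewer's current viewing direction need to be delivered and decoded at high quality. The Python sketch below illustrates one possible client-side selection step under assumed parameters (a hypothetical 6x3 equirectangular subpicture grid and a 90-degree field of view); it is not the paper's actual implementation, and all names and values are illustrative.

# Minimal sketch (not the paper's implementation) of subpicture selection
# for viewport-dependent streaming of an equirectangular 360-degree video.
# Grid size, FOV, and all identifiers are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Subpicture:
    index: int
    yaw_min: float    # degrees, in [-180, 180)
    yaw_max: float
    pitch_min: float  # degrees, in [-90, 90]
    pitch_max: float

def make_grid(cols=6, rows=3):
    """Split an equirectangular frame into a cols x rows subpicture grid."""
    yaw_step, pitch_step = 360.0 / cols, 180.0 / rows
    return [
        Subpicture(
            index=r * cols + c,
            yaw_min=-180.0 + c * yaw_step,
            yaw_max=-180.0 + (c + 1) * yaw_step,
            pitch_min=-90.0 + r * pitch_step,
            pitch_max=-90.0 + (r + 1) * pitch_step,
        )
        for r in range(rows) for c in range(cols)
    ]

def wrap_yaw(yaw):
    """Wrap an angle into [-180, 180)."""
    return (yaw + 180.0) % 360.0 - 180.0

def yaw_overlaps(sub, yaw_lo, yaw_hi):
    """True if the subpicture's yaw span intersects the viewport's yaw span,
    handling wrap-around at +/-180 degrees."""
    lo, hi = wrap_yaw(yaw_lo), wrap_yaw(yaw_hi)
    intervals = [(lo, hi)] if lo <= hi else [(lo, 180.0), (-180.0, hi)]
    return any(sub.yaw_min < b and sub.yaw_max > a for a, b in intervals)

def select_subpictures(subs, yaw, pitch, hfov=90.0, vfov=90.0):
    """Return indices of subpictures intersecting the viewport centered at
    (yaw, pitch). These would be streamed/decoded at high quality; the rest
    could be skipped or fetched at a much lower bit rate.
    Note: treating the viewport as a yaw/pitch rectangle is a simplification;
    a real client would project the viewport onto the sphere."""
    pitch_lo = max(-90.0, pitch - vfov / 2.0)
    pitch_hi = min(90.0, pitch + vfov / 2.0)
    return [
        s.index
        for s in subs
        if yaw_overlaps(s, yaw - hfov / 2.0, yaw + hfov / 2.0)
        and s.pitch_min < pitch_hi and s.pitch_max > pitch_lo
    ]

if __name__ == "__main__":
    grid = make_grid()
    # Viewer looking straight ahead: only the central band of subpictures
    # needs to be streamed at high quality.
    print(select_subpictures(grid, yaw=0.0, pitch=0.0))

With the assumed 6x3 grid and 90-degree field of view, looking straight ahead selects 6 of the 18 subpictures; the remaining subpictures can be skipped or fetched at a much lower bit rate, which is the intuition behind the reported bit-rate and decoding-time savings.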

Keywords

Acknowledgement

This work was supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. RS-2022-00167169, Development of Moving Robot-based Immersive Video Acquisition and Processing System in Metaverse).
