• Title/Summary/Keyword: Immersive Technologies


Video classifier with adaptive blur network to determine horizontally extrapolatable video content (적응형 블러 기반 비디오의 수평적 확장 여부 판별 네트워크)

  • Minsun Kim; Changwook Seo; Hyun Ho Yun; Junyong Noh
    • Journal of the Korea Computer Graphics Society, v.30 no.3, pp.99-107, 2024
  • While the demand for extrapolating video content horizontally or vertically is increasing, even the most advanced techniques cannot successfully extrapolate every video. It is therefore important to determine whether a given video can be extrapolated well before attempting the actual extrapolation, which helps avoid wasting computing resources. This paper proposes a video classifier that identifies whether a video is suitable for horizontal extrapolation. The classifier utilizes optical flow and an adaptive Gaussian blur network and can be applied to flow-based video extrapolation methods. Training labels were assigned through user tests and quantitative evaluations. Trained on this labeled dataset, the network learns to judge the extrapolation capability of a given video. By capturing video characteristics through optical flow and the adaptive Gaussian blur network, the proposed classifier achieves substantially more accurate classification than methods that use only the original video or a fixed blur. Combined with automatic video extrapolation techniques, the classifier can be utilized in various fields to support immersive viewing experiences.
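
As a rough illustration of the pipeline the abstract describes, the sketch below shows a binary classifier that takes precomputed optical flow for a clip, predicts a blur strength from that flow, applies the resulting Gaussian blur, and outputs a probability that the clip is horizontally extrapolatable. This is a minimal sketch under stated assumptions, not the authors' implementation: the single per-clip blur sigma, the small CNNs, all layer sizes, and the assumption that optical flow is produced upstream by an off-the-shelf estimator are illustrative choices.

```python
# Minimal sketch (not the paper's code) of an adaptive-blur classifier:
# flow -> predicted blur sigma -> blurred flow -> binary extrapolatability score.
import torch
import torch.nn as nn
import torch.nn.functional as F


def gaussian_blur(x: torch.Tensor, sigma: torch.Tensor, ksize: int = 9) -> torch.Tensor:
    """Separably blur flow maps (B, 2, H, W) with a per-sample sigma (B,)."""
    half = ksize // 2
    coords = torch.arange(-half, half + 1, device=x.device, dtype=x.dtype)
    # One 1-D Gaussian kernel per sample, normalized to sum to 1.
    kernel = torch.exp(-(coords[None, :] ** 2) / (2 * sigma[:, None] ** 2 + 1e-6))
    kernel = kernel / kernel.sum(dim=1, keepdim=True)            # (B, K)
    b, c, _, _ = x.shape
    out = []
    for i in range(b):  # horizontal pass, then vertical pass
        k = kernel[i].view(1, 1, 1, ksize).repeat(c, 1, 1, 1)
        xi = F.conv2d(x[i:i + 1], k, padding=(0, half), groups=c)
        xi = F.conv2d(xi, k.transpose(2, 3), padding=(half, 0), groups=c)
        out.append(xi)
    return torch.cat(out, dim=0)


class AdaptiveBlurClassifier(nn.Module):
    """Predicts a blur strength from the flow, blurs the flow, then classifies."""

    def __init__(self):
        super().__init__()
        # Small CNN regressing one blur sigma per clip (assumption: the paper's
        # adaptive blur may instead be spatially or temporally varying).
        self.sigma_net = nn.Sequential(
            nn.Conv2d(2, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, 1), nn.Softplus(),           # keep sigma positive
        )
        # Feature extractor and binary head applied to the blurred flow.
        self.features = nn.Sequential(
            nn.Conv2d(2, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(64, 1)

    def forward(self, flow: torch.Tensor) -> torch.Tensor:
        # flow: (B, 2, H, W) optical flow between consecutive frames,
        # assumed to come from any standard flow estimator.
        sigma = self.sigma_net(flow).squeeze(1) + 0.1   # avoid degenerate sigma
        blurred = gaussian_blur(flow, sigma)
        logit = self.head(self.features(blurred))
        return torch.sigmoid(logit)                     # P(extrapolatable)


if __name__ == "__main__":
    model = AdaptiveBlurClassifier()
    dummy_flow = torch.randn(4, 2, 128, 256)            # batch of 4 flow maps
    print(model(dummy_flow).shape)                       # torch.Size([4, 1])
```

In a setup like this, the binary label for each training clip would come from the user tests and quantitative evaluations mentioned in the abstract, and the classifier would be run before invoking a flow-based extrapolation method to skip clips it deems unsuitable.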