• Title/Summary/Keyword: Studio camera


Extraction of Camera Parameters Using Projective Invariance for Virtual Studio

  • Han, Seo-Won;Lee, Joon-Whaon;Nakajima, Masayuki
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 1998.06b / pp.141-146 / 1998
  • Current virtual studios use the chroma-key method, in which an image is captured and the blue portion of that image is replaced by a computer-generated or real image. The replacement image must be changed according to the camera motion. This paper proposes a novel method that extracts camera parameters by recognizing pentagonal patterns painted on the blue screen; the parameters in question are the position, direction, and focal length of the studio camera. First, the pentagonal patterns are found using the projective invariant features of the pentagon. Then the projective transformation between the two projected images and the camera parameters are calculated from the matched points. Simulation results indicate that the camera parameters are calculated more easily than with conventional methods. (A minimal sketch of the five-point projective invariant is given after this entry.)

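A minimal sketch of the kind of projective invariant the paper relies on. This determinant-ratio formulation for five coplanar points is a common one from the invariance literature, not the authors' exact computation, and the example coordinates are made up:

```python
import numpy as np

def _det3(p, i, j, k):
    """Determinant of the 3x3 matrix built from homogeneous points i, j, k."""
    return np.linalg.det(np.stack([p[i], p[j], p[k]]))

def pentagon_invariants(pts):
    """Two projective invariants of five coplanar points (pentagon corners).

    pts: (5, 2) array of image coordinates in a consistent corner order.
    The ratios are unchanged by any planar projective transformation, so they
    can be matched against the known pattern painted on the blue screen.
    """
    p = np.hstack([np.asarray(pts, float), np.ones((5, 1))])  # homogeneous coords
    i1 = (_det3(p, 0, 1, 3) * _det3(p, 0, 2, 4)) / (_det3(p, 0, 1, 4) * _det3(p, 0, 2, 3))
    i2 = (_det3(p, 1, 2, 3) * _det3(p, 1, 0, 4)) / (_det3(p, 1, 2, 4) * _det3(p, 1, 0, 3))
    return i1, i2

# Example: a (hypothetical) pentagon; the same two numbers would be obtained
# from any projectively warped image of it, which is what enables matching.
pentagon = np.array([[0, 0], [2, 0], [2.5, 1.5], [1, 2.5], [-0.5, 1.5]])
print(pentagon_invariants(pentagon))
```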

A Study of Changes in Production by Domestic Broadcasters Using Virtual Studio

  • Lee, Jun-Sang;Park, Sung-Dae;Kim, Chee-Yong
    • Journal of information and communication convergence engineering / v.9 no.1 / pp.117-123 / 2011
  • This paper investigates and analyzes how widely the virtual studio is used by domestic broadcasting companies and how it shapes their production environments. The history of the virtual studio goes back to the beginnings of computer graphics. A virtual studio is a way to produce a program using graphic sets created by a computer rather than physical sets; it allows expression beyond the limitations of a real studio set, and various visual effects can be created by computer. A virtual studio can generate three-dimensional graphics, and these graphics can be interlocked with live camera images to build the visual spaces for various programs. The aim of this trend is not simply to produce programs with basic image compositing but to achieve images so natural that it is hard to tell they were artificially created. This paper analyzes the changes in production at domestic broadcasters' virtual studios as well as their usage, and suggests an ideal direction for developing such production.

The Extraction of Camera Parameters using Projective Invariance for Virtual Studio (가상 스튜디오를 위한 카메라 파라메터의 추출)

  • Han, Seo-Won;Eom, Gyeong-Bae;Lee, Jun-Hwan
    • The Transactions of the Korea Information Processing Society / v.6 no.9 / pp.2540-2547 / 1999
  • The chroma-key method is one of the key technologies for realizing a virtual studio: the blue portions of a captured image are replaced with a computer-generated or real image. The replacement image must be changed according to the studio camera's parameters so that it merges naturally with the non-blue portions of the captured image. This paper proposes a novel method to extract camera parameters by recognizing pentagonal patterns painted on a blue screen. We extract corresponding points between the blue screen and a captured image using the projective invariant features of a pentagon, and then calculate the camera parameters from these correspondences using a modification of Tsai's method. Experimental results indicate that the proposed method is more accurate than the conventional method and can process about twelve video frames per second on a Pentium-MMX processor clocked at 166 MHz. (A hedged pose-estimation sketch from such correspondences is given after this entry.)

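A rough sketch of recovering a camera pose from matched pattern points. This uses OpenCV's generic solvePnP as a stand-in, not the paper's modified Tsai calibration; the screen/image coordinates and intrinsics below are made-up placeholders, and in the paper the focal length is estimated rather than assumed:

```python
import numpy as np
import cv2

# Known pentagon corner positions on the blue-screen plane (hypothetical, metres)
# and their matched pixel locations in the captured frame (hypothetical).
screen_pts = np.array([[0.0, 0.0, 0.0],
                       [1.0, 0.0, 0.0],
                       [1.3, 0.9, 0.0],
                       [0.5, 1.5, 0.0],
                       [-0.3, 0.9, 0.0]], dtype=np.float32)
image_pts = np.array([[412, 530], [655, 540], [730, 335],
                      [540, 190], [345, 320]], dtype=np.float32)

# Assumed intrinsics (focal length in pixels, principal point).
f, cx, cy = 1100.0, 640.0, 360.0
K = np.array([[f, 0, cx], [0, f, cy], [0, 0, 1]], dtype=np.float64)

# Generic pose estimation from the matched points: rotation (direction) and
# translation of the camera relative to the blue-screen pattern.
ok, rvec, tvec = cv2.solvePnP(screen_pts, image_pts, K, None)
R, _ = cv2.Rodrigues(rvec)
cam_pos = -R.T @ tvec          # camera centre expressed in screen coordinates
print(ok, cam_pos.ravel())
```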

Camera Parameter Extraction Method for Virtual Studio Applications by Tracking the Location of TV Camera (가상스튜디오에서 실사 TV 카메라의 3-D 기준 좌표와 추적 영상을 이용한 카메라 파라메타 추출 방법)

  • 한기태;김회율
    • Journal of Broadcast Engineering / v.4 no.2 / pp.176-186 / 1999
  • In order to produce images that appear realistic to the audience in a virtual studio system, it is important to synchronize the foreground objects precisely with the background image provided by computer graphics. In this paper, we propose a camera parameter extraction method for this synchronization that tracks the pose of the TV camera. We derive an equation for extracting the camera parameters from the inverse perspective equations used to track the camera pose and from the 3-D transformation between the base coordinates and the estimated coordinates. We show the validity of the proposed method in terms of the accuracy ratio between the parameters computed from the equation and the real parameters applied to a TV camera. (The projection relation that these equations invert is sketched after this entry.)

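A minimal numeric sketch of the pinhole relation behind such inverse perspective equations; the intrinsics and the test point are assumptions for illustration only, and the paper's full derivation additionally involves the rigid transform between base and estimated coordinates:

```python
import numpy as np

# Pinhole projection:  u = f*X/Z + cx,  v = f*Y/Z + cy.
# Inverting it only fixes a viewing ray; the pose between the studio's base
# coordinates and the camera's estimated coordinates supplies the rest.
f, cx, cy = 1000.0, 640.0, 360.0     # assumed intrinsics

def project(P_cam):
    X, Y, Z = P_cam
    return np.array([f * X / Z + cx, f * Y / Z + cy])

def back_project(uv, depth):
    """Inverse perspective: pixel plus an assumed depth -> 3-D point (camera frame)."""
    u, v = uv
    return np.array([(u - cx) * depth / f, (v - cy) * depth / f, depth])

P = np.array([0.4, -0.1, 5.0])           # hypothetical point in camera coordinates
uv = project(P)
print(uv, back_project(uv, depth=5.0))   # recovers the original point
```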

Information Application for the Blind (시각 장애인을 위한 안내정보 어플리케이션)

  • Shin, Eun-bi;Roh, Tae-Kyung
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2018.05a / pp.358-359 / 2018
  • In this paper, OpenCV and Android Studio are used to distinguish objects in front of a blind user. When movement is detected by the smartphone camera, the moving region is labeled, the user is notified, and the region is then tracked with the mean shift algorithm. A C++ program based on OpenCV was used for real-time motion observation, and the application is built with Android Studio. As a result of the study, moving objects are identified by labeling, a bounding box is assigned to each, and the mean shift algorithm moves the box along with the object so that objects are tracked in real time. (A minimal mean-shift tracking sketch follows this entry.)

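A minimal mean-shift tracking loop in the spirit of the paper, using OpenCV's standard calcBackProject/meanShift pattern. The camera index and the initial box (which the paper obtains from labeling the detected motion) are assumptions:

```python
import cv2

cap = cv2.VideoCapture(0)                      # stand-in for the smartphone camera
ok, frame = cap.read()
x, y, w, h = 300, 200, 80, 80                  # assumed initial box from labeling
roi = frame[y:y + h, x:x + w]
hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    _, (x, y, w, h) = cv2.meanShift(back, (x, y, w, h), term)  # move box with object
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow('track', frame)
    if cv2.waitKey(30) == 27:                  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```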

A Study on Virtual Studio Application using Microsoft Hololens

  • Lee, Jaehyun;Kim, Seunghyeon;Kim, Lyounghui;Kang, Jinwook;Lee, Seunghyun;Kwon, Soonchul
    • International journal of advanced smart convergence / v.6 no.4 / pp.80-87 / 2017
  • Mixed Reality (MR) composites virtual objects into the real world and has been applied to various fields since the introduction of head-mounted displays (HMDs) such as Microsoft's Hololens [1-3]. The virtual studio in broadcasting combines computer-generated content with the actual set to reproduce a 3D image on screen. This requires physical space, such as a chroma-key set, as well as professional knowledge, manpower, and costly equipment for lengthy post-processing of the graphics. Therefore, this paper studies the implementation of a virtual studio based on Mixed Reality using the Microsoft Hololens. Through the implementation of the 'Holo-studio' application, real and virtual objects are acquired simultaneously from the broadcasting camera's viewpoint. With Microsoft's spectator view library, the frame rate degrades for high-polygon objects (100,000 polygons), whereas the proposed method maintains 60 fps image transmission for such objects. The results show the possibility of a low-cost virtual studio that needs no separate physical space.

Application of Virtual Studio Technology and Digital Human Monocular Motion Capture Technology -Based on <Beast Town> as an Example-

  • YuanZi Sang;KiHong Kim;JuneSok Lee;JiChu Tang;GaoHe Zhang;ZhengRan Liu;QianRu Liu;ShiJie Sun;YuTing Wang;KaiXing Wang
    • International Journal of Internet, Broadcasting and Communication / v.16 no.1 / pp.106-123 / 2024
  • This article takes the talk show "Beast Town" as an example to introduce the overall technical solution, the technical difficulties, and the countermeasures involved in combining cartoon virtual characters with virtual studio technology, providing reference experience for multi-scenario applications of digital humans. Compared with earlier live broadcasts that mixed real and virtual elements, we further upgraded our virtual production and digital-human driving technology, adopting industry-leading real-time virtual production and monocular-camera driving techniques to launch the virtual cartoon character talk show "Beast Town". The show blends the real and the virtual, further enhances program immersion and the audio-visual experience, and expands the boundaries of virtual production. In the talk show, motion-capture shooting is used for final picture synthesis; the virtual scene must present dynamic effects while the digital human is driven and the overall picture moves with the camera's push, pull, and pan. This places very high demands on multi-party data synchronization, real-time driving of the digital human, and rendering of the composited picture. We focus on issues such as connecting virtual and real data and the quality of monocular-camera motion capture, and we combine outward camera tracking, multi-scene perspective matching, and multi-machine rendering to solve picture-linkage and rendering-quality problems in a deeply immersive space, presenting users with visual effects in which digital humans and live guests interact.

A Study on the Adequate HD Camera Focal Length in the Broadcasting Studio using LED Video Wall (LED 비디오월을 사용하는 방송환경에서 HD 카메라의 적정 초점거리 연구)

  • Choi, Ki-chang;Kwon, Soon-chul;Lee, Seung-hyun
    • The Journal of the Convergence on Culture Technology / v.8 no.5 / pp.713-721 / 2022
  • When using an LED video wall in a broadcasting studio, a few things require care. First, since the pixels are closely arranged, a moire pattern may appear because of the short arrangement period; second, the distance between pixels (pixel pitch) may be resolved on the image sensor of the broadcast camera. When moire occurs or the pixel pitch is visible, viewers feel uncomfortable. The moire effect can be reduced by adjusting the camera's shooting distance or angle, but to prevent the pixel pitch from being recorded on the image sensor, a sufficient distance must be kept between the LED video wall and the camera. Even when that distance is secured, the zoom lens used in the studio must be operated with an appropriate change of magnification: if the focal length is changed to obtain a desired angle of view, the pixel pitch may be recorded unintentionally. In this study, we propose the range of zoom-lens magnification over which the pixel pitch is not observed when the distance from the video wall is sufficiently secured. Content was played back on the LED video wall, and the wall was recorded to a server using an HD camera equipped with a B4-mount zoom lens. (A rough pixel-pitch visibility calculation is sketched after this entry.)

Remote practice of AVR system (AVR 시스템의 원격 실습방법)

  • Kim, Byun-Gon;Baek, Jong-Deuk;Kim, Myung-Soo;Jeong, Kyeong-Taek;Kwon, Oh-Shin
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2017.10a / pp.751-753 / 2017
  • In this paper, we implement a remote training kit using a camera, an Arduino, and an AVR practice kit so that the AVR kit can be used remotely. The implemented system can be used by many users, one at a time, from remote locations. The practitioner writes a program in AVR Studio through PC remote control and downloads it to the AVR training kit. When the program is running and the mouse is clicked or dragged, the input signal is transmitted to the Arduino, which passes the actual button input signal or analog voltage on to the AVR kit. When the AVR kit responds to the input signal, its operation can be checked through the camera. Therefore, with the implemented system, multiple users can perform AVR training using a single kit. (A minimal PC-side relay sketch is given after this entry.)

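A minimal PC-side sketch of relaying a user action to the Arduino over serial. The port name and the one-line command protocol are hypothetical; the abstract does not specify how the PC and Arduino actually communicate:

```python
import time
import serial   # pyserial

# Hypothetical protocol: 'B<n>' asks the Arduino to assert button line n on the
# AVR kit, 'R' releases it.  The real kit's commands are not given in the abstract.
ser = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)
time.sleep(2)                      # give the Arduino time to reset after the port opens

def press_button(n, hold_s=0.2):
    """Ask the Arduino to press button line n on the AVR kit briefly."""
    ser.write(f'B{n}\n'.encode())
    time.sleep(hold_s)
    ser.write(b'R\n')              # release

press_button(1)                    # the kit's response is then observed via the camera
ser.close()
```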