• Title/Summary/Keyword: Omni-directional lens


3D Analysis of Scene and Light Environment Reconstruction for Image Synthesis (영상합성을 위한 3D 공간 해석 및 조명환경의 재구성)

  • Hwang, Yong-Ho;Hong, Hyun-Ki
    • Journal of Korea Game Society, v.6 no.2, pp.45-50, 2006
  • To generate a photo-realistic synthesized image, the light environment must be reconstructed through 3D analysis of the scene. This paper presents a novel method for identifying the positions and characteristics of the light sources (both global and local) in a real image, which are then used to illuminate synthetic objects. First, a High Dynamic Range (HDR) radiance map is generated from omni-directional images taken by a digital camera with a fisheye lens. The positions of the camera and the light sources are then identified automatically from correspondences between images, without a priori camera calibration. Light sources are classified by whether they illuminate the whole scene, and the 3D illumination environment is reconstructed from this classification. Experimental results showed that the proposed method, combined with distributed ray tracing, achieves photo-realistic image synthesis. Animators and lighting experts in the film and animation industry are expected to benefit from it.
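The HDR radiance-map step described in the abstract can be sketched as a Debevec-and-Malik-style merge of differently exposed images into relative radiance. This is a minimal illustration, not the paper's pipeline: the function name, the hat-shaped weighting, and the assumption of a linear camera response are ours.

```python
import numpy as np

def hdr_radiance_map(exposures, times):
    """Merge differently exposed images (pixel values in [0, 1]) into a
    relative HDR radiance map.  `exposures` is a list of float arrays of
    equal shape; `times` holds the matching exposure times in seconds.
    Assumes a linear camera response (illustrative sketch only)."""
    exposures = [np.asarray(e, dtype=float) for e in exposures]
    times = np.asarray(times, dtype=float)

    # Hat-shaped weight: trust mid-range pixels, distrust pixels that are
    # nearly under- or over-exposed.
    def weight(z):
        return 1.0 - np.abs(2.0 * z - 1.0)

    num = np.zeros_like(exposures[0])
    den = np.zeros_like(exposures[0])
    for img, t in zip(exposures, times):
        w = weight(img)
        num += w * (img / t)  # per-exposure radiance estimate
        den += w
    return num / np.maximum(den, 1e-8)
```

For a scene point of constant radiance, halving the exposure time halves the recorded value, so every exposure votes for the same radiance and the weighted average recovers it.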


Coordinates Transformation and Correction Techniques of the Distorted Omni-directional Image (왜곡된 전 방향 영상에서의 좌표 변환 및 보정)

  • Cha, Sun-Hee;Park, Young-Min;Cha, Eui-Young
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference, v.9 no.1, pp.816-819, 2005
  • This paper proposes a coordinate-correction technique that combines a 3D parabolic coordinate transformation function with a back-propagation (BP) neural network to solve the spatial distortion introduced by a catadioptric camera. Although a catadioptric camera can capture an omni-directional image covering all 360 degrees, the shape of the mirror itself distorts the image. Accordingly, to recover ideal distance coordinates in 3D space from the distorted image, a coordinate transformation function is used that relates the focus of the parabolic mirror to the parabolic projection of each input-image point. The residual error of this transformation is then corrected by the BP neural-network algorithm.
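The parabolic coordinate transformation described above can be sketched with the standard paraboloidal catadioptric model: an image point, captured orthographically off a parabolic mirror, is lifted onto the mirror surface and turned into a viewing ray through the mirror's focus. This is a hedged sketch of that geometry, not the paper's exact function; the mirror parameter `h` and the function name are our assumptions.

```python
import math

def parabolic_unproject(u, v, h):
    """Map an image point (u, v) from a paraboloidal catadioptric camera
    to a unit viewing ray through the mirror focus.

    Model (assumed): orthographic projection onto the mirror
    z = (x^2 + y^2 - h^2) / (2h), with the focus at the origin and
    mirror parameter h > 0.  The lifted mirror point, seen from the
    focus, gives the direction of the incoming scene ray."""
    z = (u * u + v * v - h * h) / (2.0 * h)  # height on the mirror surface
    n = math.sqrt(u * u + v * v + z * z)     # distance from the focus
    return (u / n, v / n, z / n)             # unit ray direction
```

For example, the image center (0, 0) maps to the ray pointing straight down the mirror axis, while a point at radius h from the center maps to a horizontal ray, which matches the mirror cutting the z = 0 plane at that radius. A residual-error corrector, such as the paper's BP neural network, would be trained on the difference between these model coordinates and measured ones.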
