• Title/Abstract/Keyword: virtual views

Search results: 87 items

증강현실용 오픈소스를 이용한 위치정보 서비스 앱의 개발 (Development of Location Information Service App Using an Open Source for Augmented Reality)

  • 손정기;주복규
    • 한국인터넷방송통신학회논문지
    • /
    • Vol. 13, No. 1
    • /
    • pp.267-272
    • /
    • 2013
  • Augmented reality (AR) is a branch of virtual reality: a computer graphics technique that composites virtual objects into a real environment so that they appear to belong to the original scene. AR has become familiar as smartphones have spread widely, and applications built on it have proven enormously popular. Using the open-source Mixare framework, we developed an augmented reality application that serves location information in real time. This paper describes the development of '홍대에 가면~' ("When You Go to Hongdae"), a location information service application for Android phones. When the user points the smartphone at a building around the campus, the app overlays detailed information about that building on the screen in real time.
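
The geometry behind this kind of location-based AR overlay is simple: compute the bearing from the device to each point of interest (POI) and map its offset from the compass azimuth onto the screen. The Python sketch below is only a minimal illustration of that idea, not Mixare's actual code; the coordinates, field of view, and screen width are made-up assumptions.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the device (lat1, lon1) to a POI (lat2, lon2), in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def screen_x(poi_bearing, device_azimuth, fov_deg=60.0, screen_w=1080):
    """Horizontal pixel position of the POI label, or None if it lies outside the camera's field of view."""
    offset = (poi_bearing - device_azimuth + 540.0) % 360.0 - 180.0  # signed angle in (-180, 180]
    if abs(offset) > fov_deg / 2:
        return None
    return int((offset / fov_deg + 0.5) * screen_w)

# Example with made-up coordinates near Hongik University; device facing due north.
x = screen_x(bearing_deg(37.5510, 126.9246, 37.5525, 126.9252), device_azimuth=0.0)
print(x)  # column right of the screen center, since the POI lies slightly to the north-east
```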

2차원/3차원 자유시점 비디오 재생을 위한 가상시점 합성시스템 (Virtual View Rendering for 2D/3D Freeview Video Generation)

  • 민동보;손광훈
    • 대한전자공학회논문지SP
    • /
    • Vol. 45, No. 4
    • /
    • pp.22-31
    • /
    • 2008
  • We propose a new method for estimating disparity from multi-view images and synthesizing virtual views, one of the key technologies for 3DTV. To estimate disparity from multi-view images efficiently and accurately, we propose a semi N-view & N-depth structure. This structure uses information from neighboring views to reduce the redundant computation that arises during disparity estimation. The proposed method provides the user with both 2D and 3D free viewpoints, and the user can select the free-viewpoint video mode. Experimental results show that the proposed method produces accurate disparity maps and that the synthesized images provide the user with natural free-viewpoint video.
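
As a rough illustration of virtual view synthesis (not the paper's semi N-view & N-depth method), an intermediate view between two rectified cameras can be approximated by forward-warping one reference view with its disparity map scaled by the desired viewpoint fraction. The sketch below assumes rectified inputs and leaves disocclusion holes unfilled.

```python
import numpy as np

def synthesize_virtual_view(left, disparity, alpha=0.5):
    """Forward-warp a rectified left view toward the right view by a fraction alpha of the disparity.

    left      : H x W x 3 uint8 image
    disparity : H x W float array, per-pixel horizontal disparity toward the right view
    alpha     : 0.0 keeps the left viewpoint, 1.0 reaches the right camera position
    """
    h, w = disparity.shape
    virtual = np.zeros_like(left)
    depth_buf = np.full((h, w), -np.inf)  # on conflicts, keep the closer (larger-disparity) pixel
    for y in range(h):
        for x in range(w):
            xv = int(round(x - alpha * disparity[y, x]))  # shift pixel toward the virtual viewpoint
            if 0 <= xv < w and disparity[y, x] > depth_buf[y, xv]:
                depth_buf[y, xv] = disparity[y, x]
                virtual[y, xv] = left[y, x]
    return virtual  # disocclusion holes stay zero; they would need inpainting or a second reference view

# toy example
left = np.random.randint(0, 255, (48, 64, 3), dtype=np.uint8)
disp = np.full((48, 64), 4.0)
mid_view = synthesize_virtual_view(left, disp, alpha=0.5)
```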

실내디자인의 지각적 프리젠테이션 방법의 특성에 관한 연구 (A Study on Characteristics of Perceptual Presentation Methods of Interior Design)

  • 이종란
    • 한국실내디자인학회논문집
    • /
    • No. 29
    • /
    • pp.265-272
    • /
    • 2001
  • The perceptual presentation of interior design is to represent an interior space planned by a designer as if people were seeing it in reality. The perceptual presentation methods that have been developed are perspectives, full-scale models, small-scale models, photography of models, video taping of models, computer images, computer animation, and virtual reality. The purpose of this study is to investigate the limits of these perceptual presentation methods according to their characteristics. The methods have characteristics that are either static or dynamic and either monoscopic or stereoscopic. In terms of representing and perceiving interior spaces, the dynamic characteristic is more helpful than the static one because it provides consecutively changing views of interior spaces as people walk around within them. The stereoscopic characteristic is more helpful than the monoscopic one because it provides binocular depth perception. Full-scale models, small-scale models, and virtual reality, which have dynamic and stereoscopic characteristics, are the most effective. The next most effective methods are video taping of models and computer animation, which have dynamic and monoscopic characteristics. The least effective methods are perspectives and photography of models, which have static and monoscopic characteristics. Even the most effective methods cannot be said to be perfect, because each of them still has limits. Designers have to consider the limits of each perceptual presentation method to find a way that shows their designs most effectively. To develop the perceptual presentation methods of interior design, researchers should focus on the helpful characteristics, which are dynamic and stereoscopic.


MPEG 몰입형 비디오 기반 6DoF 영상 스트리밍 성능 분석 (Performance Analysis of 6DoF Video Streaming Based on MPEG Immersive Video)

  • 정종범;이순빈;김인애;류은석
    • 방송공학회논문지
    • /
    • Vol. 27, No. 5
    • /
    • pp.773-793
    • /
    • 2022
  • The Moving Picture Experts Group (MPEG) immersive video (MIV) compression standard was established to support six degrees of freedom (6DoF) in virtual reality through the transmission of multiple high-quality immersive videos. Considering the trade-off between bitrate and computational complexity, MIV provides two compression approaches: 1) removing inter-view redundancy, or 2) selecting and transmitting only representative views. This paper presents performance analysis results for both approaches based on high-efficiency video coding (HEVC) and versatile video coding (VVC), focusing on virtual views synthesized at the input view positions and on views rendered at the user's viewpoint.
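
The first MIV approach, removing inter-view redundancy, can be caricatured as pruning pixels of an additional view that are already reconstructible from a basic view. The toy Python sketch below assumes a rectified pair and a known disparity map; it only illustrates the idea and is not the MIV test model's pruner.

```python
import numpy as np

def prune_additional_view(basic, additional, disparity, color_thresh=12.0):
    """Toy inter-view pruning for a rectified pair of views.

    disparity gives, for each pixel of the additional view, the horizontal offset of its
    correspondence in the basic view. Pixels already explained by the basic view are masked
    out; only the remaining pixels (occlusions, view-dependent differences) would be packed
    into an atlas and coded.
    """
    h, w, _ = additional.shape
    keep = np.ones((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            xs = int(round(x + disparity[y, x]))  # correspondence in the basic view
            if 0 <= xs < w:
                diff = np.abs(basic[y, xs].astype(float) - additional[y, x].astype(float)).mean()
                if diff < color_thresh:
                    keep[y, x] = False            # redundant: reconstructible from the basic view
    return keep                                   # True = patch pixel to transmit

basic = np.random.randint(0, 255, (32, 48, 3), dtype=np.uint8)
additional = basic.copy()                         # fully redundant toy case
mask = prune_additional_view(basic, additional, np.zeros((32, 48)))
print(mask.sum(), "pixels left to code")          # 0 here, since the toy views are identical
```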

Research on the Correlation Between the Alienation Effect and Immersion of Breaking the Fourth Wall in Games

  • Qi Yi;Jeanhun Chung
    • International journal of advanced smart convergence
    • /
    • Vol. 12, No. 4
    • /
    • pp.328-333
    • /
    • 2023
  • Breaking the fourth wall is a very popular concept right now, and depictions of breaking the barrier between virtuality and reality are often used in game advertising. In VR games, manufacturers describe the experience of breaking the fourth wall as one in which the player is completely immersed in the virtual world, as if actually living in it. At the same time, research in traditional drama shows that breaking the fourth wall can also produce a sense of alienation, allowing the experiencer to realize clearly that they and the character inhabit completely different worlds and to engage in aesthetic criticism of the work. Why breaking the fourth wall can produce these two completely different feelings is the subject of this article. The article offers a theoretical analysis of the relationship between these two different cognitions, examining three factors: game perspective, game art style, and the game's world view. It concludes that when players break the fourth wall in a game, these three factors determine which of the two cognitions the experiencer has.

Accuracy of virtual models in the assessment of maxillary defects

  • Kamburoglu, Kivanc;Kursun, Sebnem;Kilic, Cenk;Ozen, Tuncer
    • Imaging Science in Dentistry
    • /
    • Vol. 45, No. 1
    • /
    • pp.23-29
    • /
    • 2015
  • Purpose: This study aimed to assess the reliability of measurements performed on three-dimensional (3D) virtual models of maxillary defects obtained using cone-beam computed tomography (CBCT) and 3D optical scanning. Materials and Methods: Mechanical cavities simulating maxillary defects were prepared on the hard palate of nine cadavers. Images were obtained using a CBCT unit at three different fields of view (FOVs) and voxel sizes: 1) 60×60 mm FOV, 0.125 mm³ (FOV60); 2) 80×80 mm FOV, 0.160 mm³ (FOV80); and 3) 100×100 mm FOV, 0.250 mm³ (FOV100). Superimposition of the images was performed using software called VRMesh Design. Automated volume measurements were conducted, and differences between surfaces were demonstrated. Silicone impressions obtained from the defects were also scanned with a 3D optical scanner. Virtual models obtained using VRMesh Design were compared with the scanned impression models. Gold standard volumes of the impression models were then compared with the CBCT and 3D scanner measurements. The general linear model was used, and significance was set at p=0.05. Results: A comparison of the results obtained by the observers and methods revealed p values smaller than 0.05, suggesting that the measurement variations were caused by both methods and observers, along with the different cadaver specimens used. The 3D scanner measurements were closer to the gold standard measurements than the CBCT measurements. Conclusion: In the assessment of artificially created maxillary defects, the 3D scanner measurements were more accurate than the CBCT measurements.
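
As a side note on the automated volume measurements, the volume enclosed by a closed triangle mesh of a defect can be computed with the divergence theorem. The Python sketch below is a generic illustration on a made-up tetrahedron, not the VRMesh Design workflow.

```python
import numpy as np

def mesh_volume(vertices, faces):
    """Volume enclosed by a closed, consistently oriented triangle mesh,
    computed as the sum of signed tetrahedron volumes against the origin."""
    v = np.asarray(vertices, dtype=float)
    total = 0.0
    for a, b, c in faces:
        total += np.dot(v[a], np.cross(v[b], v[c])) / 6.0
    return abs(total)

# Toy example: a unit tetrahedron (volume = 1/6)
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
faces = [(0, 2, 1), (0, 1, 3), (0, 3, 2), (1, 2, 3)]  # outward-facing orientation
print(mesh_volume(verts, faces))  # ~0.1667
```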

A Study on "A Midsummer Night's Palace" Using VR Sound Engineering Technology

  • Seok, MooHyun;Kim, HyungGi
    • International Journal of Contents
    • /
    • Vol. 16, No. 4
    • /
    • pp.68-77
    • /
    • 2020
  • VR (Virtual Reality) contents make the audience perceive virtual space as real through the virtual Z axis, which creates a sense of space that cannot be created in 2D because of the distance between the audience's eyes. This visual change has led to the need for technological changes to the sound and sound sources inserted into VR contents. However, studies on increasing immersion in VR contents are still focused more on scientific and visual fields. This is because composing and producing VR sound requires professional views in two areas: sound-based engineering and computer-based interactive sound engineering. With sound-based engineering, it is difficult to reflect changes in user interaction or in time and space, because the sound effects, script sound, and background music are directed according to the storyboard organized by the director; however, it has the advantage that the sound effects, script sound, and background music are produced in one track and no coding phase is required. Computer-based interactive sound engineering, on the other hand, produces the sound effects, script sound, and background music as separate files. It can increase immersion by reflecting user interaction or time and space, but it can also suffer from noise cancelling and sound collisions. Therefore, in this study, the following methods were devised and used to produce the sound for the VR contents "A Midsummer Night" so as to take advantage of each sound-making technology. First, the storyboard is analyzed according to user interaction, identifying the sound effects, script sound, and background music required for each interaction. Second, the sounds are classified and analyzed as 'simultaneous sound' and 'individual sound'. Third, interaction coding is carried out for the sound effects, script sound, and background music produced in the simultaneous and individual sound categories. Then the contents are completed by applying the sound to the video. Through this process, sound quality inhibitors such as noise cancelling can be removed while allowing sound production that fits user interaction and time and space.
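
A minimal sketch of the second and third steps (classifying cues as simultaneous vs. individual and coding individual cues against interactions) might look as follows; the class names, events, and cue sheet are hypothetical and not the authors' implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Cue:
    name: str
    category: str       # "simultaneous" (always layered, e.g. background music) or "individual" (interaction-driven)
    trigger: str = ""   # interaction event that fires an individual cue

@dataclass
class VRSoundManager:
    cues: list[Cue]
    playing: set[str] = field(default_factory=set)

    def start_scene(self):
        # simultaneous cues are mixed from the start of the scene
        for cue in self.cues:
            if cue.category == "simultaneous":
                self.playing.add(cue.name)

    def on_interaction(self, event: str):
        # individual cues are coded against specific user interactions
        for cue in self.cues:
            if cue.category == "individual" and cue.trigger == event:
                self.playing.add(cue.name)

# hypothetical cue sheet derived from a storyboard analysis
mgr = VRSoundManager(cues=[
    Cue("palace_theme", "simultaneous"),
    Cue("door_creak", "individual", trigger="open_door"),
    Cue("narration_02", "individual", trigger="enter_hall"),
])
mgr.start_scene()
mgr.on_interaction("open_door")
print(mgr.playing)  # {'palace_theme', 'door_creak'}
```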

다시점 영상의 프레임율 변환 방식 분석 (Analysis of Frame Rate Up Conversion for Multi-View System)

  • 양윤모;이도훈;오병태
    • 한국방송∙미디어공학회:학술대회논문집
    • /
    • 한국방송공학회 2015 Summer Conference
    • /
    • pp.334-336
    • /
    • 2015
  • This paper describes how to apply frame rate up conversion (FRUC) to multi-view video. First, we briefly introduce view synthesis (VS), which generates an image at an arbitrary intermediate viewpoint, and frame rate up conversion, which creates new frames between consecutive frames. We then introduce a conventional FRUC method and a FRUC method that exploits the characteristics of multi-view video, and compare the quality of the images generated by each method.

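As a reference point for the frame rate up conversion discussed above, one common baseline is unilateral block matching followed by motion-compensated averaging of the two neighboring frames. The Python sketch below illustrates only that baseline; it is not one of the multi-view-aware methods compared in the paper, and a real FRUC scheme would additionally fill the holes left by the block placement.

```python
import numpy as np

def fruc_midframe(f0, f1, block=8, search=4):
    """Interpolate a frame halfway between two grayscale frames f0 and f1
    using simple block matching and motion-compensated averaging."""
    h, w = f0.shape
    mid = np.zeros_like(f0, dtype=np.float64)
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            ref = f0[by:by + block, bx:bx + block].astype(np.float64)
            best, best_dxy = np.inf, (0, 0)
            for dy in range(-search, search + 1):          # find where the block moved in f1
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= h - block and 0 <= x <= w - block:
                        cand = f1[y:y + block, x:x + block].astype(np.float64)
                        sad = np.abs(ref - cand).sum()
                        if sad < best:
                            best, best_dxy = sad, (dy, dx)
            dy, dx = best_dxy
            matched = f1[by + dy:by + dy + block, bx + dx:bx + dx + block].astype(np.float64)
            hy = min(max(by + dy // 2, 0), h - block)       # place the block halfway along the motion vector
            hx = min(max(bx + dx // 2, 0), w - block)
            mid[hy:hy + block, hx:hx + block] = 0.5 * (ref + matched)
    return mid.astype(f0.dtype)                             # uncovered pixels remain zero (holes)

f0 = np.random.randint(0, 255, (64, 64), dtype=np.uint8)
f1 = np.roll(f0, 2, axis=1)                                 # toy: global 2-pixel horizontal motion
f_half = fruc_midframe(f0, f1)                              # approximates a 1-pixel-shifted frame
```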

TEST OF A LOW COST VEHICLE-BORNE 360 DEGREE PANORAMA IMAGE SYSTEM

  • Kim, Moon-Gie;Sung, Jung-Gon
    • 대한원격탐사학회:학술대회논문집
    • /
    • 대한원격탐사학회 2008 International Symposium on Remote Sensing
    • /
    • pp.137-140
    • /
    • 2008
  • Recently, many areas require wide field-of-view images, such as surveillance, virtual reality, navigation, and 3D scene reconstruction. Conventional camera systems have a limited field of view and provide only partial information about the scene; omnidirectional vision systems can overcome these disadvantages. Acquiring 360 degree panorama images usually requires an expensive omnidirectional camera lens. In this study, a 360 degree panorama image system was tested using a low cost optical reflector that captures 360 degree panoramic views in a single shot. The system can be used together with detailed positional information from GPS/INS. The test results show that 360 degree panorama imagery is a very effective tool for a mobile monitoring system.

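The single-shot reflector image from such a system is typically unwrapped into a rectangular panorama by polar-to-Cartesian resampling around the mirror center. The sketch below shows that generic step with made-up center and radius parameters; it is not the authors' calibration or GPS/INS integration.

```python
import numpy as np

def unwrap_panorama(omni, cx, cy, r_in, r_out, out_w=720):
    """Unwrap a circular omnidirectional image (grayscale H x W array) captured with a
    mirror reflector into a rectangular panorama by polar-to-Cartesian resampling."""
    out_h = int(r_out - r_in)
    pano = np.zeros((out_h, out_w), dtype=omni.dtype)
    for row in range(out_h):
        r = r_out - row                       # top of the panorama = outer ring of the mirror image
        for col in range(out_w):
            theta = 2.0 * np.pi * col / out_w
            x = int(round(cx + r * np.cos(theta)))
            y = int(round(cy + r * np.sin(theta)))
            if 0 <= y < omni.shape[0] and 0 <= x < omni.shape[1]:
                pano[row, col] = omni[y, x]   # nearest-neighbour; bilinear sampling would look smoother
    return pano

# toy input: a synthetic 480x480 omni image with the mirror centered at (240, 240)
omni = np.random.randint(0, 255, (480, 480), dtype=np.uint8)
pano = unwrap_panorama(omni, cx=240, cy=240, r_in=60, r_out=230)
print(pano.shape)  # (170, 720)
```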

Novel View Generation Using Affine Coordinates

  • Sengupta, Kuntal;Ohya, Jun
    • 한국방송∙미디어공학회:학술대회논문집
    • /
    • 한국방송공학회 1997 Proceedings, International Workshop on New Video Media Technology
    • /
    • pp.125-130
    • /
    • 1997
  • In this paper we present an algorithm to generate new views of a scene, starting with images from weakly calibrated cameras. Errors in 3D scene reconstruction usually get reflected in the quality of the newly generated scene, so we seek a direct method for reprojection. We use the knowledge of dense point matches and their affine coordinate values to estimate the corresponding affine coordinate values in the new scene. We borrow ideas from the object recognition literature and extend them significantly to solve the reprojection problem. Unlike epipolar line intersection algorithms for reprojection, which require at least eight matched points across three images, we need only five matched points. The theory of reprojection is combined with hardware-based rendering to achieve fast rendering. We demonstrate our results of novel view generation from stereo pairs for arbitrary locations of the virtual camera.

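The affine-coordinate transfer behind this kind of reprojection can be sketched as follows under an affine camera approximation: express a point in the affine basis of four matched points, recover its coordinates from two reference views by least squares, and re-synthesize its position in the virtual view from the basis points' locations there. The numpy sketch below uses synthetic affine cameras; the paper itself works with dense matches and hardware-based rendering.

```python
import numpy as np

def affine_coords(pt_views, basis_views):
    """Recover affine coordinates (a, b, c) of a point from its projections in two
    reference views, given the projections of the four affine basis points.

    pt_views    : (2, 2) array, the point's (x, y) in view 1 and view 2
    basis_views : (2, 4, 2) array, basis points p0..p3 in each view
    Under an affine camera, x = p0 + a(p1-p0) + b(p2-p0) + c(p3-p0) holds in every view,
    so stacking both views gives four linear equations in (a, b, c).
    """
    A, b = [], []
    for v in range(2):
        p0, p1, p2, p3 = basis_views[v]
        A.append(np.stack([p1 - p0, p2 - p0, p3 - p0], axis=1))  # 2x3 block per view
        b.append(pt_views[v] - p0)
    A, b = np.vstack(A), np.concatenate(b)
    coords, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coords

def reproject(coords, basis_new):
    """Transfer the point into a new (virtual) view from the basis points' positions there."""
    q0, q1, q2, q3 = basis_new
    a, b, c = coords
    return q0 + a * (q1 - q0) + b * (q2 - q0) + c * (q3 - q0)

# toy check: three synthetic affine cameras observing a random 3D scene
rng = np.random.default_rng(0)
P = rng.normal(size=(5, 3))                                  # 4 basis points + 1 extra point
cams = [(rng.normal(size=(2, 3)), rng.normal(size=2)) for _ in range(3)]
proj = lambda cam, X: X @ cam[0].T + cam[1]                  # affine projection x = M X + t
views = [proj(c, P) for c in cams]                           # (5, 2) image points per view

abc = affine_coords(np.array([views[0][4], views[1][4]]),
                    np.array([views[0][:4], views[1][:4]]))
print(reproject(abc, views[2][:4]))                          # matches the true projection...
print(views[2][4])                                           # ...of the 5th point in the third view
```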