• Title/Summary/Keyword: virtual screen interface

Search Results: 31

Experiencing the 3D Color Environment: Understanding User Interaction with a Virtual Reality Interface (3차원 가상 색채 환경 상에서 사용자의 감성적 인터랙션에 관한 연구)

  • Oprean, Danielle; Yoon, So-Yeon
    • Science of Emotion and Sensibility / v.13 no.4 / pp.789-796 / 2010
  • The purpose of this study was to test a large-screen, rear-projected virtual reality (VR) interface for color choice in environmental design. The study piloted a single three-dimensional model of a bedroom, including furniture, in different color combinations. Using a mouse with an 8′ × 6′ rear-projection screen, participants could move with 360-degree motion in each room. The study used 34 college students who viewed and interacted with virtual rooms projected on a large screen, then filled out a survey. This study aimed to understand the interaction between users and the VR interface through measurable dimensions of that interaction: interest and user perceptions of presence and emotion. Specifically, the study focused on spatial presence, topic involvement, and enjoyment. The findings should inform design researchers how empirical evidence on environmental effects can be obtained using a VR interface and how users experience interaction with the interface.

  • PDF

LCD Module Initialization and Panel Display for the Virtual Screen of LN2440SBC Embedded Systems (LN2440SBC 임베디드 시스템의 가상 스크린을 위한 LCD 모듈 초기화 및 패널 디스플레이)

  • Oh, Sam-Kweon; Park, Geun-Duk; Kim, Byoung-Kuk
    • Journal of Advanced Navigation Technology / v.14 no.3 / pp.452-458 / 2010
  • In the case of an embedded system with limited computing resources such as power and CPU capacity, the overhead of displaying data on the screen can significantly affect system performance. This paper describes an initialization method for LCD-driving components such as the ARM core, the LCD controller, and the SPI (serial peripheral interface). It also introduces a pixel display function and a panel display method that uses a virtual screen to reduce display overhead on an LN2440SBC system with an ARM9-based S3C2440A microprocessor. A virtual screen is a region of memory allocated much larger than that needed to display a single image; a specific region of the virtual screen is displayed by assigning it as the view-port region. Such a display is useful in an embedded system when concurrently running tasks produce and display their respective results on the screen, especially when each task partially, rather than totally, modifies its result in turn before displaying it. If the tasks running on such a system divide the virtual screen region and use it efficiently, display overhead can be minimized. To compare performance with and without the virtual screen, two different images are displayed in turn and the time consumed for their display is measured. The result shows that display with the virtual screen is about five times faster than without it.
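
The virtual-screen/view-port idea described in this abstract can be illustrated with a short sketch. The C fragment below is a hypothetical illustration, not the paper's driver code: it assumes a virtual screen buffer larger than the physical panel and shows how a view-port origin selects which region is shown. The names (`virt_screen`, `panel`, `set_viewport`) and the dimensions are assumptions; on real S3C2440A hardware the view-port would normally be selected by reprogramming the LCD controller's frame-buffer start-address and offset registers rather than by copying pixels.

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical dimensions: a virtual screen four times the panel area. */
#define VIRT_W  480          /* virtual screen width in pixels       */
#define VIRT_H  544          /* virtual screen height in pixels      */
#define PANEL_W 240          /* physical LCD panel width             */
#define PANEL_H 272          /* physical LCD panel height            */

static uint16_t virt_screen[VIRT_H][VIRT_W]; /* tasks draw here (RGB565)       */
static uint16_t panel[PANEL_H][PANEL_W];     /* stands in for the frame buffer */

/* Each task draws into its own sub-region of the virtual screen,
 * so a partial update never disturbs another task's pixels.        */
static void put_pixel(int x, int y, uint16_t color)
{
    if (x >= 0 && x < VIRT_W && y >= 0 && y < VIRT_H)
        virt_screen[y][x] = color;
}

/* Show the PANEL_W x PANEL_H window whose top-left corner is (ox, oy).
 * A software copy is used here for clarity; a real driver would instead
 * point the controller's start-address registers at the region.        */
static void set_viewport(int ox, int oy)
{
    for (int row = 0; row < PANEL_H; ++row)
        memcpy(panel[row], &virt_screen[oy + row][ox],
               PANEL_W * sizeof(uint16_t));
}

int main(void)
{
    put_pixel(10, 10, 0xF800);   /* task A writes into its region        */
    put_pixel(300, 300, 0x07E0); /* task B writes into another region    */

    set_viewport(0, 0);          /* display task A's region              */
    set_viewport(240, 272);      /* switch to task B's region: no redraw */
    return 0;
}
```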

A Mixed Reality Based Interface for Planning Layouts (공간 배치를 위한 혼합현실 기반의 인터페이스)

  • Kang, Hyun; Lee, Gun A.; Son, Wook-Ho
    • Journal of the HCI Society of Korea / v.2 no.2 / pp.45-51 / 2007
  • Space planning is one of the popular applications of VR technology, including interior design, architectural design, and factory layout. To provide an easier way to accommodate physical objects in the virtual space-planning task, we suggest applying a mixed reality (MR) interface. We describe the hardware and software of our MR system, designed according to the requirements of the application domain. In brief, the system hardware consists of a video see-through display with a touch-screen interface mounted on a mobile platform, and we use screen-space 3D manipulations to arrange virtual objects within the MR scene. Investigating the interface with our prototype implementation, we are convinced that our system will help users design spaces in an easier and more effective way.

  • PDF
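
The "screen-space 3D manipulation" mentioned in the abstract above can be sketched as follows. This is a hypothetical C illustration, not code from the paper: a 2D touch point is unprojected into a view ray and intersected with the ground plane, and the hit point becomes the new position of the object being dragged. The camera model, names (`unproject_touch`, `drag_to_ground`), and parameter values are assumptions.

```c
#include <math.h>
#include <stdio.h>

typedef struct { float x, y, z; } Vec3;

/* Hypothetical pinhole camera: position, axes, vertical FOV, screen size. */
typedef struct {
    Vec3  pos, fwd, right, up;
    float fov_y;             /* vertical field of view in radians */
    int   width, height;     /* screen size in pixels             */
} Camera;

/* Turn a touch point (px, py) into a view-ray direction in world space. */
static Vec3 unproject_touch(const Camera *c, float px, float py)
{
    float aspect = (float)c->width / (float)c->height;
    float ty = tanf(c->fov_y * 0.5f);
    float nx = (2.0f * px / c->width  - 1.0f) * ty * aspect;
    float ny = (1.0f - 2.0f * py / c->height) * ty;

    Vec3 d = {
        c->fwd.x + nx * c->right.x + ny * c->up.x,
        c->fwd.y + nx * c->right.y + ny * c->up.y,
        c->fwd.z + nx * c->right.z + ny * c->up.z
    };
    float len = sqrtf(d.x * d.x + d.y * d.y + d.z * d.z);
    d.x /= len; d.y /= len; d.z /= len;
    return d;
}

/* Intersect the touch ray with the ground plane y = 0; returns 0 on miss.
 * The hit point is where the dragged virtual object would be placed.      */
static int drag_to_ground(const Camera *c, float px, float py, Vec3 *hit)
{
    Vec3 d = unproject_touch(c, px, py);
    if (fabsf(d.y) < 1e-6f) return 0;
    float t = -c->pos.y / d.y;
    if (t < 0.0f) return 0;
    hit->x = c->pos.x + t * d.x;
    hit->y = 0.0f;
    hit->z = c->pos.z + t * d.z;
    return 1;
}

int main(void)
{
    Camera cam = { {0, 1.6f, -3}, {0, 0, 1}, {1, 0, 0}, {0, 1, 0},
                   1.0f, 800, 480 };
    Vec3 p;
    if (drag_to_ground(&cam, 400, 400, &p))   /* touch in the lower half */
        printf("place object at (%.2f, %.2f, %.2f)\n", p.x, p.y, p.z);
    return 0;
}
```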

Virtual Screen Interface using Motion Vector in Mobile Environment (영상 기반의 움직임 벡터를 이용한 모바일 환경에서의 가상 화면 제어 인터페이스)

  • Kim, Dong-Chul; Lim, Sang-Oh; Han, Tack-Don
    • Proceedings of the HCI Society of Korea Conference (한국HCI학회 학술대회논문집) / 2006.02a / pp.993-998 / 2006
  • This paper proposes an interface that creates a virtual screen to overcome the restricted display of mobile devices such as conventional mobile phones and lets the user control it easily. Whereas the conventional approach scrolls the screen by pressing buttons repeatedly, the proposed interface extracts motion vectors from the video signal captured by the phone's camera and moves the screen in the direction of the user's motion, providing a convenient and intuitive interface that overcomes the phone's limited screen. We propose an algorithm for extracting motion vectors in the mobile environment and, by overcoming the limits that the restricted screen places on presenting information, propose a new interface for mobile environments.

  • PDF
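
As a rough illustration of panning a virtual screen with camera-derived motion, the C sketch below estimates a single global motion vector between two grayscale frames by full-search block matching with a sum-of-absolute-differences (SAD) cost, then shifts the view-port offset accordingly. It is a hypothetical sketch, not the algorithm proposed in the paper; the frame size, block size, search range, and function names are assumptions.

```c
#include <stdlib.h>
#include <stdint.h>

#define FRAME_W  160   /* assumed preview frame size              */
#define FRAME_H  120
#define BLOCK     32   /* central block used for matching         */
#define SEARCH     8   /* +/- search range in pixels              */

/* SAD between the central block of `prev` and the block of `cur`
 * displaced by (dx, dy).                                          */
static long sad(const uint8_t *prev, const uint8_t *cur, int dx, int dy)
{
    int bx = (FRAME_W - BLOCK) / 2, by = (FRAME_H - BLOCK) / 2;
    long sum = 0;
    for (int y = 0; y < BLOCK; ++y)
        for (int x = 0; x < BLOCK; ++x) {
            int a = prev[(by + y) * FRAME_W + (bx + x)];
            int b = cur [(by + y + dy) * FRAME_W + (bx + x + dx)];
            sum += abs(a - b);
        }
    return sum;
}

/* Full-search block matching: returns the displacement minimizing SAD. */
static void motion_vector(const uint8_t *prev, const uint8_t *cur,
                          int *mvx, int *mvy)
{
    long best = -1;
    for (int dy = -SEARCH; dy <= SEARCH; ++dy)
        for (int dx = -SEARCH; dx <= SEARCH; ++dx) {
            long s = sad(prev, cur, dx, dy);
            if (best < 0 || s < best) { best = s; *mvx = dx; *mvy = dy; }
        }
}

/* Pan the virtual-screen view-port opposite to the camera motion,
 * clamping to the virtual screen bounds.                           */
static void pan_viewport(int *ox, int *oy, int mvx, int mvy,
                         int virt_w, int virt_h, int view_w, int view_h)
{
    *ox -= mvx;
    *oy -= mvy;
    if (*ox < 0) *ox = 0;
    if (*ox > virt_w - view_w) *ox = virt_w - view_w;
    if (*oy < 0) *oy = 0;
    if (*oy > virt_h - view_h) *oy = virt_h - view_h;
}

int main(void)
{
    static uint8_t prev[FRAME_W * FRAME_H], cur[FRAME_W * FRAME_H];
    int mvx = 0, mvy = 0, ox = 100, oy = 100;
    /* ... fill prev/cur with consecutive camera preview frames ... */
    motion_vector(prev, cur, &mvx, &mvy);
    pan_viewport(&ox, &oy, mvx, mvy, 480, 640, 176, 220);
    return 0;
}
```

In use, `motion_vector` would be called once per camera frame and the resulting displacement applied to the current view-port origin, so the virtual screen follows the direction in which the user moves the phone.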

Development of a Control and Virtual Reality Visual System for the Tilting Train Simulator (틸팅 차량용 시뮬레이터 적용을 위한 통제 및 가상현실 영상 시스템 개발)

  • Song, Young-Soo; Han, Seong-Ho; Kim, Jung-Seok
    • Journal of the Korean Society for Railway / v.8 no.4 / pp.330-336 / 2005
  • This paper presents the development of the control and virtual reality visual systems for a tilting train simulator. The user of the tilting train simulator can set up environmental and operating conditions through the user interface provided by the control system. In the control system, an arbitrary track with a user-defined curve radius, length, and direction can be generated. The virtual reality visual system provides an artificial environment composed of facilities such as stations, platforms, track, bridges, tunnels, and the signaling system. To maximize realism, all 3D modeling was based on real photographs taken on the Jungang Line. A dome screen with a 1600 mm diameter was used to maximize the viewing angle. The hemispherical screen ensures a viewing angle of 170 degrees in the vertical direction and 135 degrees in the lateral direction.
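
The generation of an arbitrary track from a user-defined curve radius, length, and direction can be sketched geometrically: the track centerline is sampled along a circular arc of the given radius, turning left or right. The C fragment below is a hypothetical illustration of that computation only; the names, the sampling step, and the example values are assumptions, not the simulator's actual code.

```c
#include <math.h>
#include <stdio.h>

typedef struct { double x, y, heading; } TrackPoint;  /* heading in radians */

/* Sample n points along a track segment of the given length, starting at
 * `start`. `radius` is the curve radius (0 means straight) and `dir` is
 * +1 for a left curve, -1 for a right curve.                              */
static void generate_segment(TrackPoint start, double radius, double length,
                             int dir, int n, TrackPoint out[])
{
    for (int i = 0; i < n; ++i) {
        double s = length * (double)(i + 1) / n;     /* arc length so far */
        double h = start.heading;
        if (radius <= 0.0) {                         /* straight segment  */
            out[i].x = start.x + s * cos(h);
            out[i].y = start.y + s * sin(h);
            out[i].heading = h;
        } else {                                     /* circular arc      */
            double rs = dir * radius;                /* signed radius     */
            double dtheta = s / rs;                  /* heading change    */
            out[i].x = start.x + rs * (sin(h + dtheta) - sin(h));
            out[i].y = start.y - rs * (cos(h + dtheta) - cos(h));
            out[i].heading = h + dtheta;
        }
    }
}

int main(void)
{
    /* Example: a 300 m right-hand curve with a 600 m radius, 10 samples. */
    TrackPoint pts[10], start = { 0.0, 0.0, 0.0 };
    generate_segment(start, 600.0, 300.0, -1, 10, pts);
    for (int i = 0; i < 10; ++i)
        printf("%7.1f %7.1f  heading %6.3f\n", pts[i].x, pts[i].y, pts[i].heading);
    return 0;
}
```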

An Analysis of Mobile Virtual Manipulatives Apps for the Teaching of Elementary School Mathematics (초등학교 수학의 교수를 위한 모바일 가상조작물 앱 분석)

  • Shin, Mikyung
    • Journal of Korea Multimedia Society / v.20 no.6 / pp.935-949 / 2017
  • The purpose of this study was to analyze the characteristics of virtual manipulatives apps that can be used to teach students struggling to learn mathematics. To achieve this goal, ten general characteristics of 23 virtual manipulatives apps were evaluated. The instructional, interface, and interactive design features of the apps were also evaluated with five-point ratings on 18 items. In addition, SPSS frequency analysis was conducted and the correlation between each pair of features was analyzed. The instructional contents presented most frequently among the 23 virtual manipulatives apps were geometry, arithmetic operations, number concepts, and measurement. The instructional level presented most frequently was lower-grade elementary school and kindergarten age. The instructional type presented most frequently was simulation. Regarding the design features, instructional design was rated the highest (mean = 3.7); interactive design (mean = 3.6) and interface design (mean = 3.3) were also rated higher than neutral. In addition, learning strategies were evaluated as appropriately presented, with little screen linkage and content error.

Controlling Position of Virtual Reality Contents with Mouth-Wind and Acceleration Sensor

  • Kim, Jong-Hyun
    • Journal of the Korea Society of Computer and Information / v.24 no.4 / pp.57-63 / 2019
  • In this paper, we propose a new framework to control VR (virtual reality) content in real time using the user's mouth wind and the acceleration sensor of a mobile device. In VR, user interaction technology is important, but a variety of user interface methods is still lacking. Most interaction technologies rely on hand-based touch-screen input or motion recognition. We propose a new interface technology that interacts with VR content in real time by combining the user's mouth wind with the acceleration sensor. The direction of the mouth wind is determined using the angle and position between the user and the mobile device, and the control position is adjusted using the device's acceleration sensor. Noise in the magnitude of the mouth wind is smoothed using a simple average filter. To demonstrate the merits of the proposed technique, we show the results of interacting with game and simulation content in real time by applying the control position and the mouth-wind external force to the game.
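
The "simple average filter" over the mouth-wind magnitude, combined with an accelerometer-driven control position, can be sketched as below. This is a hypothetical C illustration under assumed names, units, and gains, not the paper's implementation: a small ring buffer smooths the wind magnitude, and the device acceleration nudges the 2D control position each frame.

```c
#include <stdio.h>

#define WIN 8                        /* window size of the average filter */

/* Ring-buffer moving average used to smooth the mouth-wind magnitude. */
typedef struct {
    float buf[WIN];
    int   idx, count;
    float sum;
} AvgFilter;

static float avg_filter_push(AvgFilter *f, float sample)
{
    if (f->count == WIN) f->sum -= f->buf[f->idx];
    else                 f->count++;
    f->buf[f->idx] = sample;
    f->sum += sample;
    f->idx = (f->idx + 1) % WIN;
    return f->sum / f->count;
}

/* One control step: the smoothed wind magnitude pushes the controlled
 * point along the blowing direction (dirx, diry), while the device
 * acceleration (ax, ay) shifts the control position. `dt` is the frame
 * time in seconds; the gain constants are assumptions.                  */
static void update_position(float *px, float *py,
                            AvgFilter *wind_filter, float wind_raw,
                            float dirx, float diry, float ax, float ay,
                            float dt)
{
    const float wind_gain  = 0.5f;
    const float accel_gain = 2.0f;

    float wind = avg_filter_push(wind_filter, wind_raw);
    *px += (wind_gain * wind * dirx + accel_gain * ax) * dt;
    *py += (wind_gain * wind * diry + accel_gain * ay) * dt;
}

int main(void)
{
    AvgFilter f = {0};
    float x = 0.0f, y = 0.0f;
    /* Simulated frames: a noisy wind of roughly constant strength. */
    float samples[6] = { 1.0f, 1.4f, 0.8f, 1.1f, 1.3f, 0.9f };
    for (int i = 0; i < 6; ++i) {
        update_position(&x, &y, &f, samples[i], 1.0f, 0.0f, 0.05f, 0.0f, 1.0f / 30);
        printf("frame %d: x=%.3f y=%.3f\n", i, x, y);
    }
    return 0;
}
```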

The User Interface of Button Type for Stereo Video-See-Through Device (Stereo Video-See-Through 장치를 위한 버튼형 인터페이스)

  • Choi, Young-Ju; Seo, Yong-Duek
    • Proceedings of the Korean Information Science Society Conference / 2007.06c / pp.217-222 / 2007
  • This paper proposes a user interface for a see-through system environment that shows images via two different cameras, so that ordinary users can easily control computer systems and various other processes. To this end, we use AR technology to synthesize a virtual button onto the image captured by the camera in real time. We then locate the hand position in the image to judge whether the finger selects the button, and the result of the judgment is visualized by changing the button color. The user can easily interact with the system by selecting the virtual button on the screen while watching the screen and moving her fingers in the air.

  • PDF

The User Interface of Button Type for Stereo Video-See-Through (Stereo Video-See-Through를 위한 버튼형 인터페이스)

  • Choi, Young-Ju; Seo, Yong-Duek
    • Journal of the Korea Computer Graphics Society / v.13 no.2 / pp.47-54 / 2007
  • This paper proposes a user interface based on a video see-through environment that shows images via stereo cameras so that the user can easily control computer systems or various other processes. We include AR technology to synthesize virtual buttons; the graphic images are overlaid on the captured frames taken by the camera in real time. We search for the hand position in the frames to judge whether or not the user selects the button. The result of the judgment is visualized by changing the button color. The user can easily interact with the system by selecting the virtual button on the screen while watching the screen and moving her fingers in the air.

  • PDF
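
The selection test described in these two related abstracts (locating the fingertip in the camera frame, judging whether it lies on the virtual button, and changing the button color) can be sketched as follows. This hypothetical C fragment is an illustration only, not the authors' code: the fingertip position is assumed to come from a hand-tracking step elsewhere, and the dwell-time threshold is an assumption.

```c
#include <stdbool.h>
#include <stdio.h>

/* A virtual button rendered at a fixed screen rectangle (pixels). */
typedef struct {
    int x, y, w, h;
    int frames_touched;        /* consecutive frames the fingertip was inside */
    bool pressed;
} VirtualButton;

#define DWELL_FRAMES 15        /* assumed: ~0.5 s at 30 fps before triggering */

/* Judge whether the fingertip (fx, fy), found by the hand-tracking step,
 * currently selects the button; returns true on the frame the press fires. */
static bool button_update(VirtualButton *b, int fx, int fy)
{
    bool inside = fx >= b->x && fx < b->x + b->w &&
                  fy >= b->y && fy < b->y + b->h;

    if (inside) b->frames_touched++;
    else        b->frames_touched = 0;

    bool fired = !b->pressed && b->frames_touched >= DWELL_FRAMES;
    b->pressed = b->frames_touched >= DWELL_FRAMES;
    return fired;
}

/* The visual feedback from the papers: draw the button highlighted while
 * the fingertip hovers over it (stubbed here as a print).                */
static void button_draw(const VirtualButton *b)
{
    printf("button at (%d,%d): %s\n", b->x, b->y,
           b->frames_touched > 0 ? "highlighted" : "normal");
}

int main(void)
{
    VirtualButton ok = { 500, 300, 120, 80, 0, false };
    /* Simulated fingertip positions from the tracker. */
    for (int frame = 0; frame < 20; ++frame) {
        if (button_update(&ok, 550, 330))
            printf("frame %d: button pressed\n", frame);
        button_draw(&ok);
    }
    return 0;
}
```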

A Gesture Interface based on Hologram and Haptics Environments for Interactive and Immersive Experiences (상호작용과 몰입 향상을 위한 홀로그램과 햅틱 환경 기반의 동작 인터페이스)

  • Pyun, Hae-Gul; An, Haeng-A; Yuk, Seongmin; Park, Jinho
    • Journal of Korea Game Society / v.15 no.1 / pp.27-34 / 2015
  • This paper proposes a user interface for enhancing immersiveness and usability by combining a hologram and a haptic device with the common Leap Motion sensor. While the Leap Motion delivers the physical motion of the user's hand to control the virtual environment, it is limited to handling virtual hands on screen and interacting with the virtual environment in only one direction. In our system, a hologram is coupled with the Leap Motion to improve user immersiveness by placing the real and virtual hands in the same location. Moreover, we provide a prototype of sensory interaction by designing a haptic device that conveys the sense of touch in the virtual environment to the user's hand.