• Title/Summary/Keyword: Information input device

567 results

An Input Device to cognize the Human Motion in Ubiquitous Environment (유비쿼터스 환경에서 사용자 동작을 인식하기 위한 입력장치)

  • Jong-Woo Kim;JeongRae Kim;Soo Bin Jeon;Chongmyung Park;In-Bum Jung
    • Annual Conference of KIPS / 2008.11a / pp.839-842 / 2008
  • The keyboard and mouse, the input devices of conventional computers, are ill-suited to a ubiquitous environment because they lack portability and intuitiveness. A ubiquitous environment requires differentiated input devices with enhanced portability and intuitiveness. In this study, we experiment with sensors that could be used in input devices for a ubiquitous environment and, based on the results, propose an input device suited to that environment.

Laser-recognizable Screen and Gun with Laser Source for Realistic Big Screen First Person Shooters Games (대화면 FPS 게임을 위한 레이저센서기반의 대형스크린과 레이저광원 권총의 설계와 구현)

  • Han, Ngoc-Son;Kim, Seong-Whan
    • Annual Conference of KIPS / 2008.05a / pp.481-484 / 2008
  • In this paper, we present a new game interface design for First Person Shooters (FPS). Traditionally, FPS games on a computer are played with a keyboard/mouse or a joystick together with a PC display. We improve the communication between the player and the game world by means of a new control system, consisting of a large screen and a laser gun, that creates a life-like space for players. Because conventional FPS setups use CRT displays, they cannot provide a large screen owing to the limitations of CRT technology. We therefore designed and implemented a new input device based on a laser-recognizable display. Our results suggest that the combined interface helps beginners enjoy playing an FPS immediately and gives experienced players a new gaming experience.

SmartPuck System : Tangible Interface for Physical Manipulation of Digital Information (스마트 퍽 시스템 : 디지털 정보의 물리적인 조작을 제공하는 실감 인터페이스 기술)

  • Kim, Lae-Hyun;Cho, Hyun-Chul;Park, Se-Hyung
    • Journal of KIISE: Computing Practices and Letters / v.13 no.4 / pp.226-230 / 2007
  • In the conventional desktop PC environment, a keyboard and mouse handle the user's input while a monitor displays visual information as the output device. To manipulate digital information, we move a virtual cursor to select the desired graphical icon on the monitor; the cursor represents the relative motion of the physical mouse on the desk. This desktop metaphor does not provide an intuitive interface grounded in human sensation. In this paper, we introduce a novel tangible interface that allows the user to interact with computers through a physical tool called "SmartPuck". The SmartPuck system bridges the gap between the analog perception and response of human beings and the digital information on the computer. The system consists of a table display based on a PDP, the SmartPuck itself, equipped with a rotational part and a button for intuitive and tactile input, and a sensing system that tracks the position of the SmartPuck. Finally, we show examples of working with the system.
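To make the table-top interaction concrete, the following is a minimal, purely illustrative sketch of how a SmartPuck-style device's tracked position, dial rotation, and button might drive a digital view on the table display; the PuckEvent and TableView names are assumptions for illustration, not the authors' implementation.

```python
# Illustrative only: map SmartPuck-style physical input (absolute position on
# the table, rotation of the dial, button press) to operations on a digital view.

class PuckEvent:
    def __init__(self, x, y, rotation_delta, button_pressed):
        self.x, self.y = x, y                   # tracked puck position (table pixels)
        self.rotation_delta = rotation_delta    # dial rotation since the last event (degrees)
        self.button_pressed = button_pressed    # state of the puck's button

class TableView:
    """Toy stand-in for the content shown on the table display."""
    def __init__(self):
        self.center, self.zoom = (0, 0), 1.0
    def pan_to(self, x, y):
        self.center = (x, y)
    def zoom_by(self, factor):
        self.zoom *= factor
    def select_at(self, x, y):
        print(f"selected object near ({x}, {y})")

def handle_puck_event(event, view):
    view.pan_to(event.x, event.y)                          # absolute mapping, unlike a mouse
    if event.rotation_delta:
        view.zoom_by(1.0 + event.rotation_delta / 360.0)   # turn the dial to zoom
    if event.button_pressed:
        view.select_at(event.x, event.y)                   # press the button to select

handle_puck_event(PuckEvent(320, 240, rotation_delta=45, button_pressed=True), TableView())
```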

Smart Fire Image Recognition System using Charge-Coupled Device Camera Image (CCD 카메라 영상을 이용한 스마트 화재 영상 인식 시스템)

  • Kim, Jang-Won
    • Fire Science and Engineering / v.27 no.6 / pp.77-82 / 2013
  • This study proposes a smart fire recognition system that traces the location of a fire with a CCD camera equipped with wired/wireless TCP/IP and pan/tilt functions, delivers the information in real time to an Android-based smart mobile device, and allows fire and disaster response to be controlled remotely. To realize the proposed method, first, an algorithm is presented that applies the hue-saturation-intensity (HSI) transform to the input video, removes ambient lighting and irrelevant regions, and segments only the fire regions. Second, the pan/tilt function traces the exact location of the fire so that it can be handled properly. Third, the Android communication system installed on the mobile device confirms the fire state and controls the response. To verify the proposed method, an experiment was conducted with 10 fire videos as input; in all 10 videos, the fire regions were segmented and the fire locations were traced.
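The color-based segmentation step in this abstract can be illustrated with a short OpenCV sketch. The paper applies an HSI transform; the code below uses OpenCV's HSV conversion as a stand-in, and the hue/saturation/value thresholds and the input file name are illustrative assumptions rather than the authors' tuned values.

```python
import cv2
import numpy as np

def segment_fire_regions(frame_bgr):
    """Keep only pixels whose color roughly resembles flame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Illustrative thresholds: reddish-to-yellow hues with high brightness.
    lower = np.array([0, 80, 180], dtype=np.uint8)
    upper = np.array([35, 255, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    # Morphological opening removes small speckles from lighting and reflections.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    return cv2.bitwise_and(frame_bgr, frame_bgr, mask=mask)

cap = cv2.VideoCapture("fire_sample.mp4")   # hypothetical input video
ok, frame = cap.read()
if ok:
    cv2.imwrite("fire_segmented.png", segment_fire_regions(frame))
cap.release()
```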

Research on establishment of the network system of teaching and learning material for the organizations linking to vocational education (직업교육 유관 기관간 교수·학습자료 공유 시스템 구축에 관한 연구)

  • Kim, Sun-Tae
    • 대한공업교육학회지 / v.30 no.1 / pp.133-148 / 2005
  • This research attempts to establish a strategy for introducing the KEM 2.0 standard. A further aim is to build a service system that automatically generates and provides the metadata contained in the Cylearn system, in order to facilitate Korean vocational high school students' access to teaching-learning materials. The main research tasks were: 1) to establish the components of the file server system while taking into account the environment in which each educational organization operates; and 2) to use KEM 2.0 to optimize the configuration of the Cylearn system and the related software, applying KEM 2.0 in close coordination with Edunet. The results can be summarized as follows. First, a strategy to introduce KEM 2.0 was established; to this end, the researcher analyzed the characteristics of the sequencing and presentation methods and suggested teaching-learning materials based on the KEM 2.0 sequencing scheme. Second, the file server was constructed using KEM 2.0, taking into account the environment in which the Edunet system operates so as to facilitate a network link with Edunet. Third, the Cylearn service system was linked to Edunet: the researcher developed a KEM 2.0-based module that transmits the metadata of teaching-learning materials to Edunet, building an input device and a databank for transferring the generated metadata, which were then used to link the Cylearn system to the Edunet system.
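As a rough sketch of the kind of transfer module described here, the snippet below assembles a simple metadata record for a teaching-learning material and posts it to a repository endpoint. The field names, the endpoint URL, and the JSON encoding are placeholders; they do not reproduce the actual KEM 2.0 element set or the Edunet interface.

```python
import json
import urllib.request

def build_metadata(title, subject, file_url):
    """Assemble a simple, placeholder metadata record for one material."""
    return {
        "general": {"title": title, "language": "ko"},
        "classification": {"subject": subject},
        "technical": {"location": file_url, "format": "application/pdf"},
    }

def send_metadata(record, endpoint="https://example.org/metadata"):  # placeholder URL
    """POST the record as JSON to the receiving service."""
    data = json.dumps(record).encode("utf-8")
    req = urllib.request.Request(endpoint, data=data,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status

record = build_metadata("Lathe Operation Basics", "Machining",
                        "https://example.org/materials/lathe.pdf")
# send_metadata(record)  # left commented out: the endpoint above is only a placeholder
```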

Design of a 7-bit 2GSPS Folding/Interpolation A/D Converter with a Self-Calibrated Vector Generator (자체보정 벡터 발생기를 이용한 7-bit 2GSPS A/D Converter의 설계)

  • Kim, Seung-Hun;Kim, Dae-Yun;Song, Min-Kyu
    • Journal of the Institute of Electronics Engineers of Korea SD / v.48 no.4 / pp.14-23 / 2011
  • In this paper, a 7-bit 2 GSPS folding/interpolation A/D converter (ADC) with a self-calibrated vector generator is proposed. The ADC is based on a folding/interpolation architecture with a folding rate of 4 and an interpolation rate of 8. A cascaded preprocessing block is used to handle high input signal frequencies, and resistive interpolation is used to reduce power consumption. Furthermore, a novel self-calibrated vector generator reduces offset errors caused by device mismatch, parasitic resistance, and parasitic capacitance. The chip has been fabricated in a 1.2 V 0.13 um 1-poly 7-metal CMOS technology. The effective chip area including the calibration circuit is 2.5 mm². The SNDR is about 39.49 dB for a 9 MHz input at a 2 GHz sampling frequency and is improved by 3 dB when the calibration circuit is enabled.

Volume Control using Gesture Recognition System

  • Shreyansh Gupta;Samyak Barnwal
    • International Journal of Computer Science & Network Security / v.24 no.6 / pp.161-170 / 2024
  • With recent technological advances, people have made great progress in ease of living, and sight, motion, sound, and speech are now incorporated into various applications and software controls. In this paper, we explore a project in which gestures play a central role. Gesture control is a heavily researched topic that keeps evolving, and this project makes use of computer vision. The main objective achieved is controlling computer settings with hand gestures using computer vision: we create a module that acts as a volume-control program, in which hand gestures adjust the computer's system volume, and we use OpenCV in its implementation. The module uses the computer's web camera to record images or video, processes them to extract the needed information, and then, based on that input, performs the corresponding action on the computer's volume settings. The program can both increase and decrease the volume. The only setup required is a web camera to capture the images and video provided by the user. The program performs gesture recognition with the help of OpenCV, Python, and its libraries; it recognizes the specified hand gestures and uses them to carry out the change in the device settings. The objective is to adjust the volume of a computer without physical interaction through a mouse or keyboard. OpenCV is a widely used tool for image processing and computer vision applications in this domain: its community consists of over 47,000 individuals, and as of a survey conducted in 2020, its estimated number of downloads exceeds 18 million.
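As an illustration of the pipeline this abstract describes, the sketch below captures webcam frames with OpenCV, locates the thumb and index fingertips with the MediaPipe Hands solution (an added assumption; the paper names only OpenCV and Python), and maps the distance between them to a volume level. The set_system_volume function is a placeholder for whatever platform-specific volume API is actually used.

```python
import math
import cv2
import mediapipe as mp

def set_system_volume(level):
    """Placeholder: wire this to a platform volume API (e.g. pycaw on Windows)."""
    print(f"volume -> {level:.0%}")

hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)                       # default web camera

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        lm = result.multi_hand_landmarks[0].landmark
        thumb, index = lm[4], lm[8]             # thumb tip and index fingertip
        distance = math.hypot(thumb.x - index.x, thumb.y - index.y)
        # Map the normalized fingertip distance (roughly 0.02-0.30) to 0-100% volume.
        set_system_volume(min(max((distance - 0.02) / 0.28, 0.0), 1.0))
    cv2.imshow("gesture volume", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):       # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```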

Designing Mobile Framework for Intelligent Personalized Marketing Service in Interactive Exhibition Space (인터랙티브 전시 환경에서 개인화 마케팅 서비스를 위한 모바일 프레임워크 설계)

  • Bae, Jong-Hwan;Sho, Su-Hwan;Choi, Lee-Kwon
    • Journal of Intelligence and Information Systems / v.18 no.1 / pp.59-69 / 2012
  • The exhibition industry, one of the government's 17 new growth engines, is related to industries such as tourism, transportation, and finance, so it has a significant ripple effect on them. Exhibition is a knowledge-intensive, eco-friendly, and high value-added industry. Over 13,000 exhibitions are held every year around the world, contributing to foreign-currency earnings. The exhibition industry is closely tied to culture and tourism, can be utilized in local and national development strategies, and improves the national brand image as well. Many countries make various efforts to invigorate the industry by arranging related laws and support systems. In Korea, more than 200 exhibitions are held every year, but only two or three of them host over 400 exhibitors, and most of the rest have few foreign exhibitors. The main weakness of domestic trade shows is that no agency manages exhibition-related statistics and there is no specific, reliable evaluation. As a result, buyers and sellers cannot be given reliable data, exhibitions grow poorly in terms of quality, and the service quality of trade shows cannot improve. Hosting many visitors (public, buyers, exhibitors) is crucial to the development of the domestic exhibition industry. To attract many visitors, the service quality of exhibitions and visitor satisfaction should be enhanced. For this purpose, a variety of real-time customized services through digital media, along with services for creating new customers and retaining existing ones, should be provided. In addition, personalized information services allow visitors to manage their time and space efficiently while avoiding the complexity of the exhibition space. The exhibition industry can gain competitiveness and an industrial foundation by building up exhibition-related statistics, creating new information, and enhancing research capability. This paper therefore deals with customized services delivered to visitors' smartphones in the exhibition space and designs a mobile framework that enables exhibition devices to interact with other devices. The mobile server framework is composed of three systems: a multiple interaction server, clients, and display devices. By building a knowledge pool of the exhibition environment, the data accumulated for each visitor can be provided as a personalized service, and the visitors' reactions in turn feed back as customized information, forming a cyclic chain structure. The multiple interaction server handles event processing, the interaction between exhibition devices and visitors' smartphones, and data management. The client is an application running on the visitor's smartphone and can be driven on a variety of platforms; it serves as the interface presenting customized services to individual visitors and handles event input and output for simultaneous participation. An exhibition device consists of a display system that shows contents and information to visitors, an interaction input-output system that receives events from visitors and turns input into actions, and a control system that connects the two. The proposed mobile framework provides individual visitors with customized and active services using their information profiles and accumulated knowledge. In addition, a user participation service is suggested by using the interaction connections between the server, the clients, and the exhibition devices. The suggested mobile framework is a technology that can be applied to cultural industries such as performances, shows, and exhibitions, laying a foundation for improving visitor participation in exhibitions and developing the exhibition industry by raising visitor interest.
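A skeleton of the multiple interaction server's event handling, as outlined in this abstract, might look like the following. Every name here (the event fields, the device registry, the recommendation rule) is an illustrative assumption about the framework, not its actual interface.

```python
from dataclasses import dataclass, field

@dataclass
class Visitor:
    visitor_id: str
    interests: list = field(default_factory=list)   # accumulated reaction history

class InteractionServer:
    """Toy stand-in for the multiple interaction server (event handling,
    device/smartphone interaction, data management)."""
    def __init__(self):
        self.visitors = {}      # visitor_id -> Visitor (the "knowledge pool")
        self.devices = {}       # device_id -> callable that displays content

    def register_device(self, device_id, display_fn):
        self.devices[device_id] = display_fn

    def handle_event(self, visitor_id, device_id, event):
        """Record the visitor's reaction and answer with personalized content."""
        visitor = self.visitors.setdefault(visitor_id, Visitor(visitor_id))
        if event.get("type") == "viewed_exhibit":
            visitor.interests.append(event["exhibit"])          # data management
        # Personalization: push content related to the most recent interest.
        recommendation = visitor.interests[-1] if visitor.interests else "overview"
        self.devices[device_id](f"More about: {recommendation}")

server = InteractionServer()
server.register_device("wall-display-1", print)
server.handle_event("visitor-42", "wall-display-1",
                    {"type": "viewed_exhibit", "exhibit": "robotics booth"})
```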

Control Method of BIFS Contents for Mobile Devices with Restricted Input Key (제한적 키 입력을 갖는 휴대 단말에서의 BIFS 콘텐츠 제어방법)

  • Kim, Jong-Youn;Moon, Nam-Mee;Park, Joo-Kyung
    • Journal of Broadcast Engineering / v.15 no.3 / pp.346-354 / 2010
  • T-DMB uses the MPEG-4 BIFS standard format for interactive broadcast data services. BIFS enables contents to be represented as a scene composed of various objects such as AV, image, graphics, and text, and enables those objects to be controlled through user interaction. BIFS was designed to be adaptable to multimedia systems with various input devices; to date, however, mobile devices with restricted input units have received little consideration. The problem is that consistent user control of interactive data contents is not possible because of the limited input units of T-DMB terminals. To solve this problem, we define a KeyNavigator node that provides a means to select or navigate objects (such as menus) in BIFS contents using only the arrow keys and the enter key of a mobile terminal. With the KeyNavigator node, BIFS content providers can author contents as they intend, and users gain a consistent and easy way to control BIFS contents.
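The navigation behavior that the KeyNavigator node provides can be modeled in ordinary code; the actual node would be authored in BIFS scene-description syntax, which is not reproduced here, and the class and key names below are illustrative only.

```python
class MenuItem:
    """A selectable object in the scene (e.g. one menu entry)."""
    def __init__(self, label):
        self.label = label
    def activate(self):
        return f"activated {self.label}"

class KeyNavigator:
    """Arrow-key/enter navigation over an ordered set of selectable objects,
    i.e. the behavior the node adds for terminals with only a few keys."""
    def __init__(self, items):
        self.items = items
        self.focus = 0                          # index of the highlighted object

    def on_key(self, key):
        if key == "UP":
            self.focus = (self.focus - 1) % len(self.items)
        elif key == "DOWN":
            self.focus = (self.focus + 1) % len(self.items)
        elif key == "ENTER":
            return self.items[self.focus].activate()
        return None

nav = KeyNavigator([MenuItem("News"), MenuItem("Weather"), MenuItem("Traffic")])
nav.on_key("DOWN")                  # focus moves to "Weather"
print(nav.on_key("ENTER"))          # -> activated Weather
```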

Design and Evaluation of a Hand-held Device for Recognizing Mid-air Hand Gestures (공중 손동작 인식을 위한 핸드 헬드형 기기의 설계 및 평가)

  • Seo, Kyeongeun;Cho, Hyeonjoong
    • KIPS Transactions on Software and Data Engineering / v.4 no.2 / pp.91-96 / 2015
  • We propose AirPincher, a handheld pointing device that recognizes delicate mid-air hand gestures for controlling a remote display. AirPincher is designed to overcome the disadvantages of the two existing kinds of hand-gesture-aware techniques, glove-based and vision-based: glove-based techniques require the cumbersome step of wearing gloves every time, while vision-based techniques make performance dependent on the distance between the user and the remote display. AirPincher allows a user to hold the device in one hand and generate several delicate finger gestures, which are captured by several sensors embedded in AirPincher close to the fingers. These features help AirPincher avoid the aforementioned disadvantages of the existing techniques. We experimentally determine an efficient size for the virtual input space and evaluate two types of pointing interfaces that use AirPincher with a remote display. Our experiments suggest appropriate configurations for using the proposed device.
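The abstract describes mapping finger gestures captured by on-device sensors to pointer actions on a remote display. A rough sketch of that mapping is shown below; the sensor fields, the virtual-input-space dimensions, and the pinch threshold are entirely hypothetical values chosen only to illustrate the idea.

```python
REMOTE_W, REMOTE_H = 1920, 1080        # remote display resolution (assumed)
VIRTUAL_W, VIRTUAL_H = 0.20, 0.12      # virtual input space in meters (assumed)

def sensor_to_pointer(finger_x, finger_y, pinch_distance):
    """Map a fingertip position in the virtual input space to screen coordinates,
    and treat a small thumb-index distance as a click."""
    sx = min(max(finger_x / VIRTUAL_W, 0.0), 1.0) * REMOTE_W
    sy = min(max(finger_y / VIRTUAL_H, 0.0), 1.0) * REMOTE_H
    clicked = pinch_distance < 0.01    # fingers closer than 1 cm -> pinch "click"
    return int(sx), int(sy), clicked

x, y, click = sensor_to_pointer(finger_x=0.05, finger_y=0.03, pinch_distance=0.008)
print(x, y, click)                     # -> 480 270 True
```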