• Title/Summary/Keyword: human-computer interface

A Study on Efficient Design of Surveillance RADAR Interface Control Unit in Naval Combat System

  • Dong-Kwan Kim;Dong-Han Jung;Won-Seok Jang;Young-San Kim;Hyo-Jo Lee
    • Journal of the Korea Society of Computer and Information
    • /
    • v.28 no.11
    • /
    • pp.125-134
    • /
    • 2023
  • In this paper, we propose an efficient surveillance RADAR (RAdio Detection And Ranging) interface control unit (ICU) design for the naval combat system. The proposed design applies a standardized architecture to modules that can be shared across ship combat system software. An error-detection function for each link was implemented to speed up recognition of a disconnection. Messages that used to be sent periodically for human-computer interaction (HCI) are now transmitted only when the datagram changes, which reduces the processing load on the console. Because adopting a low-cost commercial radar leaves the ship's radar with limited processing capability, the proposed design supplements it with a waterfall scope and time-limited splash recognition for hit checking and zeroing of shots. As a result, the operator can easily determine whether a shot has hit, the probability of misrecognition is reduced, and the radar's resources are used more effectively.
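The change-triggered transmission idea described in this abstract can be sketched as follows; the class name, the digest-based comparison, and the callback interface are illustrative assumptions, not details from the paper:

```python
# Sketch of change-triggered HCI message transmission: instead of sending
# console-status datagrams periodically, a message is emitted only when the
# datagram content differs from the last one sent.
import hashlib

class ChangeTriggeredSender:
    """Forwards a datagram only when its content has changed."""
    def __init__(self, transmit):
        self._transmit = transmit        # callback that actually sends the bytes
        self._last_digest = None

    def send(self, datagram: bytes) -> bool:
        digest = hashlib.sha256(datagram).digest()
        if digest == self._last_digest:
            return False                  # unchanged: suppress, saving console load
        self._last_digest = digest
        self._transmit(datagram)
        return True

sent = []
sender = ChangeTriggeredSender(sent.append)
sender.send(b"track:0042 bearing:270")   # first message: transmitted
sender.send(b"track:0042 bearing:270")   # unchanged: suppressed
sender.send(b"track:0042 bearing:271")   # changed: transmitted
```

Comparing a digest rather than the raw bytes keeps the stored state small when datagrams are large; for short messages, storing the last datagram directly would work equally well.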

Development of User Interface Design Guidelines for Education Software Designers (교육용 소프트웨어 설계자를 위한 사용자 인터페이스 설계지침 개발)

  • Yun, Cheol-Ho
    • Journal of the Ergonomics Society of Korea
    • /
    • v.22 no.3
    • /
    • pp.45-56
    • /
    • 2003
  • This study was conducted to develop user interface design guidelines for those who design education software products (web sites or CD-ROM titles). To establish the guideline scheme, international standards, commercial design guidelines, and research papers were surveyed; in particular, ISO 9241 was used as the basic model of the scheme. First, the research group developed draft guidelines. Education software developers, designers, and a user group then reviewed the draft, which was revised according to their comments. Five components were selected as the primary classes of the guideline scheme: general principle, dialogue design, user guidance, visual interface, and information presentation. Each component was divided into several subcomponents as a secondary class. Finally, 45 items were selected as user interface design guidelines for education software design.

An Interactive Voice Web Browser Usable as a Multimodal Interface in Information Devices by Using VoiceXML

  • Jang, Min-Seok
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.14 no.6
    • /
    • pp.771-775
    • /
    • 2004
  • The present Web environment is mostly composed of HTML (Hypertext Mark-up Language), so users obtain web information mainly in a GUI (Graphical User Interface) environment, clicking the mouse to follow hyperlinks. Compared with an interface in which information can be obtained by voice, this environment is inconvenient. Building on today's mature voice recognition/synthesis technology, VoiceXML, an XML-derived language for delivering information over the telephone, can address this inconvenience. This paper presents a VoiceXML VUI (Voice User Interface) browser designed and implemented to realize this technology, together with a VoiceXML dialog designed for the browser's efficient use.

A Real time Internet Game Played with a Brain-Computer Interfaced Animal (뇌-기계접속 된 동물과 사람사이의 실시간 인터넷게임)

  • Lee, H.J.;Kim, D.H.;Lang, Y.R.;Han, S.H.;Kim, Y.B.;Lee, G.S.;Lee, E.J.;Song, C.G.;Shin, H.C.
    • Proceedings of the HCI Society of Korea Conference (한국HCI학회 학술대회논문집)
    • /
    • 2007.02a
    • /
    • pp.780-783
    • /
    • 2007
  • Many studies have been made on predicting human voluntary movement intention in real time, using invasive or non-invasive methods, to help severely motor-disabled persons by offering some motor-control and communication abilities. In the present study, we developed an internet game driven by and/or linked to a brain-computer interface (BCI) system. The activities of two single neuronal units, recorded from either the hippocampus or the prefrontal cortex of SD rats, were used in real time to control the two-dimensional movements of a robot or a game object.

Multilingual automatic generation of computer system message (컴퓨터메세지의 다국어자동생성)

  • Choi, Suk-Doo
    • Journal of the Korean Society for information Management
    • /
    • v.3 no.2
    • /
    • pp.3-17
    • /
    • 1986
  • Computer system messages play a critical role in helping users understand the system. Messages that are inaccurate or difficult to understand make it harder for users to correct errors and increase the chance of further errors, so messages are an important part of the human interface. In this paper, the generation of Korean and Japanese computer system messages at various levels of politeness is presented as an example of improving the man-machine interface.
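The core idea, selecting a rendering of the same message code by language and politeness level, can be sketched minimally; the message code, the two politeness levels, and the example strings are assumptions for illustration, not the paper's actual generation rules:

```python
# Toy table-driven sketch of multilingual, politeness-graded system messages:
# one abstract message code maps to concrete text per (language, politeness).
MESSAGES = {
    "FILE_NOT_FOUND": {
        ("en", "plain"):  "File not found.",
        ("en", "polite"): "The file could not be found. Please check the name.",
        ("ko", "plain"):  "파일이 없음.",
        ("ko", "polite"): "파일을 찾을 수 없습니다. 이름을 확인해 주십시오.",
    },
}

def render(code: str, lang: str, politeness: str) -> str:
    """Look up the concrete message text for a code, language, and level."""
    return MESSAGES[code][(lang, politeness)]

print(render("FILE_NOT_FOUND", "ko", "polite"))
```

A real generator for Korean and Japanese would derive the honorific verb endings grammatically rather than storing every variant, but the lookup interface stays the same.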

ADVANCED MMIS TOWARD SUBSTANTIAL REDUCTION IN HUMAN ERRORS IN NPPS

  • Seong, Poong Hyun;Kang, Hyun Gook;Na, Man Gyun;Kim, Jong Hyun;Heo, Gyunyoung;Jung, Yoensub
    • Nuclear Engineering and Technology
    • /
    • v.45 no.2
    • /
    • pp.125-140
    • /
    • 2013
  • This paper gives an overview of methods to inherently prevent human errors and to effectively mitigate their consequences by securing defense-in-depth during plant management through an advanced man-machine interface system (MMIS). The significance of reducing human error during an accident in nuclear power plants (NPPs) needs no emphasis: unexpected shutdowns caused by human errors not only threaten nuclear safety but also severely lower public acceptance of nuclear power. We must recognize that human errors are always possible, since humans are not perfect, particularly under stressful conditions. However, advanced information and communication technologies, applied on the basis of lessons learned from past experience, give us the opportunity to improve this situation. As important lessons, the authors explain key issues associated with automation, the man-machine interface, operator support systems, and procedures. Based on this investigation, we outline the concept and technical factors needed to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility for nuclear safety belongs to humans, not to machines; therefore safety culture, including education and training as an organizational factor, should be emphasized as well. Regarding safety culture for human error reduction, several issues that we face today are described. We expect the ideas of the advanced MMIS proposed in this paper to guide the future direction of related research and ultimately improve the safety of NPPs.

Computer Vision Based Efficient Control of Presentation Slides (컴퓨터비전에 기반한 효율적인 프리젠테이션 슬라이드 제어)

  • 박정우;석민수;이준호
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.40 no.4
    • /
    • pp.232-239
    • /
    • 2003
  • This paper discusses the design and implementation of a human-oriented interface, based on computer vision, that efficiently controls presentation slides. The user is no longer confined to a keyboard or mouse and can move around freely, because slides can be advanced or reversed using the ordinary laser pointer already used for the presentation. Regions for virtual buttons are placed on the slide so that the user can conveniently point at them with the laser pointer. We propose a simple and efficient method that computes the button areas in the camera image without complicated calibration. The proposed method has been implemented on Microsoft PowerPoint and can also be applied to other PowerPoint-like presentation software. This human-centered slide control lets the presenter give the audience a more interactive presentation in a natural way.
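The vision step this abstract describes can be sketched in a minimal, hypothetical form: locate the laser dot as the brightest spot in a grayscale frame of the projected slide, then test whether it falls inside a virtual button region. The function names, brightness threshold, and button layout are assumptions for illustration, not the authors' implementation:

```python
def find_laser_dot(gray_frame, threshold=250):
    """gray_frame: 2D list of 0-255 intensities; returns (x, y) of the
    brightest pixel if it reaches the threshold, else None."""
    best, best_xy = -1, None
    for y, row in enumerate(gray_frame):
        for x, v in enumerate(row):
            if v > best:
                best, best_xy = v, (x, y)
    return best_xy if best >= threshold else None

def hit_button(point, buttons):
    """buttons: mapping of name -> (x0, y0, x1, y1) rectangle in image coords."""
    if point is None:
        return None
    px, py = point
    for name, (x0, y0, x1, y1) in buttons.items():
        if x0 <= px < x1 and y0 <= py < y1:
            return name
    return None

frame = [[0] * 64 for _ in range(48)]     # tiny synthetic camera frame
frame[40][60] = 255                       # simulated laser dot
buttons = {"next": (56, 36, 64, 48), "prev": (0, 36, 8, 48)}
print(hit_button(find_laser_dot(frame), buttons))   # prints "next"
```

A real implementation would first warp the camera image onto slide coordinates, which is what makes the paper's calibration-free mapping of button areas the interesting part.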

W3C based Interoperable Multimodal Communicator (W3C 기반 상호연동 가능한 멀티모달 커뮤니케이터)

  • Park, Daemin;Gwon, Daehyeok;Choi, Jinhuyck;Lee, Injae;Choi, Haechul
    • Journal of Broadcast Engineering
    • /
    • v.20 no.1
    • /
    • pp.140-152
    • /
    • 2015
  • HCI (Human Computer Interaction) enables interaction between people and computers through human-familiar interfaces called modalities. Recently, to provide an optimal interface for various devices and service environments, advanced HCI methods using multiple modalities have been intensively studied. However, a multimodal interface is difficult to build because modalities have different data formats and are hard to coordinate efficiently. To solve this problem, a multimodal communicator is introduced, based on the EMMA (Extensible MultiModal Annotation markup language) and MMI (Multimodal Interaction Framework) standards of the W3C (World Wide Web Consortium). This standards-based framework, consisting of modality components, an interaction manager, and a presentation component, makes multiple modalities interoperable and provides wide expansion capability to other modalities. Experimental results demonstrate the multimodal communicator with the eye-tracking and gesture-recognition modalities in a map-browsing scenario.
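The architectural pattern named here, modality components feeding an interaction manager that drives a presentation component, can be sketched in a hedged, simplified form. The class, event names, and the crude fusion rule are illustrative; the real W3C framework exchanges MMI life-cycle events carrying EMMA documents:

```python
# Minimal sketch of the MMI pattern: modality components report
# interpretations to an interaction manager, which fuses them and
# drives the presentation component.
class InteractionManager:
    def __init__(self, presenter):
        self.presenter = presenter   # presentation-component callback
        self.gaze = None             # last interpretation from the eye tracker

    def on_event(self, modality, interpretation):
        # crude fusion rule: remember where the user is looking,
        # and act on it when a "select" gesture arrives
        if modality == "eye_tracker":
            self.gaze = interpretation          # e.g. a map coordinate
        elif modality == "gesture" and interpretation == "select":
            self.presenter(f"select {self.gaze}")

shown = []
im = InteractionManager(shown.append)
im.on_event("eye_tracker", "(37.5N, 127.0E)")   # gaze fixes a map location
im.on_event("gesture", "select")                # gesture triggers the action
print(shown)   # prints ['select (37.5N, 127.0E)']
```

The point of the standardized architecture is that a new modality component only has to emit events in the common format; the interaction manager and presentation component are unchanged.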

Interface of Interactive Contents using Vision-based Body Gesture Recognition (비전 기반 신체 제스처 인식을 이용한 상호작용 콘텐츠 인터페이스)

  • Park, Jae Wan;Song, Dae Hyun;Lee, Chil Woo
    • Smart Media Journal
    • /
    • v.1 no.2
    • /
    • pp.40-46
    • /
    • 2012
  • In this paper, we describe interactive content whose input interface is vision-based body gesture recognition. Because the content takes as its subject the imp, a figure common to Asian culture, players can enjoy it with cultural familiarity, and since they use their own gestures to fight the imp in the game, they are naturally absorbed in it. Users can also choose among multiple endings at the end of the scenario. For gesture recognition, KINECT is used to obtain the three-dimensional coordinates of each limb joint and capture the static poses of the actions. Vision-based 3D human pose recognition is used to convey human gestures in HCI (Human-Computer Interaction). A 2D pose model can recognize only simple 2D poses in particular environments; a 3D pose model, which describes the 3D human skeletal structure, can recognize more complex poses because it can use joint angles and the shape information of body parts. Because a gesture can be represented as a sequence of static poses, we recognize gestures composed of such poses using an HMM. Using the gesture recognition results as the input interface, users can control the content naturally with their gestures alone, and the real-time interaction with the imp is intended to improve immersion and interest.
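The "gesture as a sequence of static poses" idea can be illustrated with a deliberately simplified scorer: each gesture is modeled as a Markov chain over discrete pose labels, and a recognized pose sequence is scored under each model. A real system, as in the paper, would use full HMMs with emission probabilities over noisy pose observations; the gesture names, pose labels, and probabilities below are invented for illustration:

```python
import math

# Each gesture model maps (pose, next_pose) -> transition probability.
MODELS = {
    "punch": {("idle", "arm_fwd"): 0.9, ("arm_fwd", "idle"): 0.9},
    "wave":  {("idle", "arm_up"): 0.9, ("arm_up", "arm_side"): 0.8,
              ("arm_side", "arm_up"): 0.8},
}
SMOOTH = 0.01   # probability for transitions absent from a model

def score(model, poses):
    """Log-probability of a pose sequence under one gesture's chain."""
    return sum(math.log(model.get(t, SMOOTH)) for t in zip(poses, poses[1:]))

def recognize(poses):
    """Pick the gesture whose model best explains the pose sequence."""
    return max(MODELS, key=lambda g: score(MODELS[g], poses))

print(recognize(["idle", "arm_fwd", "idle"]))               # prints "punch"
print(recognize(["idle", "arm_up", "arm_side", "arm_up"]))  # prints "wave"
```

Working in log space avoids underflow as sequences grow, which is the same reason practical HMM decoders use log probabilities.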

EEG Signals Measurement and Analysis Method for Brain-Computer Interface (뇌와 컴퓨터의 인터페이스를 위한 뇌파 측정 및 분석 방법)

  • Sim, Kwee-Bo;Yeom, Hong-Gi;Lee, In-Yong
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.18 no.5
    • /
    • pp.605-610
    • /
    • 2008
  • There are many methods for human-computer interfacing. Recently, many researchers have been studying brain signals, because they would allow not only the disabled to use a computer by thought alone, without their limbs, but would also be convenient for everyone; such studies, however, are still at an early stage. This paper proposes an EEG signal measurement and analysis method for a brain-computer interface. The purpose of this research is to recognize subjects' intentions when they imagine moving their arms. EEG signals were recorded during imagined arm movement at electrode positions Fp1, Fp2, C3, and C4. We analyzed the ERS (Event-Related Synchronization) and ERD (Event-Related Desynchronization) patterns that appear in the μ and β waves when people move their limbs. The results showed that, over the left hemisphere, μ waves decreased and β waves increased during imagined movement of the right hand; conversely, over the right hemisphere, μ waves decreased and β waves increased during imagined movement of the left hand.
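The band-power quantity underlying ERD/ERS comparisons can be shown with a toy computation: power in the μ band (roughly 8-12 Hz) versus the β band (roughly 13-30 Hz) of a one-second signal. The sampling rate, the naive DFT, and the synthetic sinusoids are assumptions for illustration only, not the paper's recording setup:

```python
import cmath
import math

FS = 128  # assumed sampling rate in Hz, so each DFT bin is 1 Hz wide here

def band_power(signal, lo_hz, hi_hz, fs=FS):
    """Sum of |DFT coefficient|^2 over the bins covering [lo_hz, hi_hz]."""
    n = len(signal)
    power = 0.0
    for k in range(lo_hz * n // fs, hi_hz * n // fs + 1):
        coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        power += abs(coeff) ** 2
    return power / n

# Synthetic stand-ins: a mu-dominant "rest" trace and a beta-dominant
# "motor imagery" trace over the contralateral hemisphere.
rest = [math.sin(2 * math.pi * 10 * t / FS) for t in range(FS)]
imagery = [math.sin(2 * math.pi * 20 * t / FS) for t in range(FS)]

assert band_power(rest, 8, 12) > band_power(rest, 13, 30)        # mu high at rest
assert band_power(imagery, 13, 30) > band_power(imagery, 8, 12)  # mu suppressed
```

In practice the signal would be epoched around the imagery cue and band power compared against a pre-cue baseline to report ERD/ERS as a percentage change.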