• Title/Summary/Keyword: intelligent interface

Search results: 649

Mobile Terminal-Based User Interface for Intelligent Robots (휴대용 단말기 기반의 지능 로봇 사용자 인터페이스)

  • Kim Gi-Oh;Xuan Pham Dai;Park Ji-Hwan;Hong Soon-Hyuk;Jeon Jae-Wook
    • The KIPS Transactions: Part B / v.13B no.2 s.105 / pp.179-186 / 2006
  • A user interface that connects a user to intelligent robots needs to be designed so that the robots can be operated efficiently. This paper analyzes how to organize a mobile-terminal-based user interface according to the functions and level of autonomy of intelligent robots, and develops PDA (Personal Digital Assistant) and smart-phone user interfaces for controlling intelligent robots remotely. In the image-based user interface, a user can watch the motion of a robot directly and control it. In the map-based interface, the amount of transmitted information is reduced, so a user can control the robot with only a small transmission delay.
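
The trade-off the abstract describes, full camera images versus a compact map-based update, can be illustrated with a minimal sketch. The message layout and field names below are assumptions for illustration only, not the paper's protocol.

```python
# Illustrative only: a compact, map-based status update such as a mobile
# client might receive instead of full camera images. Field names and the
# binary layout are assumptions, not taken from the paper.
import struct
from typing import List, Tuple

def pack_map_update(x: float, y: float, heading: float,
                    obstacle_cells: List[Tuple[int, int]]) -> bytes:
    """Pack the robot pose plus occupied grid cells into a small binary message."""
    payload = struct.pack("<fffH", x, y, heading, len(obstacle_cells))
    for cx, cy in obstacle_cells:
        payload += struct.pack("<hh", cx, cy)
    return payload

def unpack_map_update(data: bytes):
    """Recover the pose and obstacle cells from a packed update."""
    x, y, heading, n = struct.unpack_from("<fffH", data, 0)
    cells, offset = [], struct.calcsize("<fffH")
    for _ in range(n):
        cells.append(struct.unpack_from("<hh", data, offset))
        offset += struct.calcsize("<hh")
    return (x, y, heading), cells

# A pose plus 20 obstacle cells is about 94 bytes, versus tens of kilobytes
# for a single JPEG camera frame, which is why the map-based interface
# suffers a smaller transmission delay.
```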

Intelligent interface model for B2B electronic commerce negotiation (B2B 상거래 협상을 위한 지능형 인터페이스 모델)

  • 임기영;고성범;조용대
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2001.05a / pp.195-198 / 2001
  • Electronic commerce systems have emerged in the field of commerce by making it possible to sell goods without the restrictions of time and space. As business based on electronic commerce has grown, many studies related to electronic commerce have been presented. In conventional electronic commerce systems, buyers purchase goods using information such as quality and price offered by the seller. In this paper we propose an intelligent interface model that can satisfy both sides of a negotiation.

Intelligent User Interface for Teleoperated Microassembly

  • Song, Eun-Ha;Kim, Deok-Ho;Kim, Kyunghwan;Lee, Jaehoon
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2001.10a / pp.168.2-168 / 2001
  • Operators generally have great difficulty manipulating micro- and nano-sized objects without the assistance of a human interface, due to the scaling effects of the micro/nano world. A micromanipulation system based on teleoperation techniques therefore enables operators to manipulate micro objects easily by transferring both human motion and manipulation skills to the micromanipulator. In teleoperated microassembly, a useful user interface is necessary to facilitate the microassembly tasks executed by human operators. In this paper, an Intelligent User Interface (UI) for teleoperated microassembly is proposed. The proposed intelligent UI improves the task performance of teleoperated micromanipulation as well as guiding operators to succeed in desirable ...

Design of Multimedia Interface for Intelligent Tutoring System (지능형 교수 시스템 지원을 위한 멀티미디어 인터페이스의 설계)

  • Jung, Sang-Mok;Lee, Wan-Bok
    • Proceedings of the Korea Contents Association Conference / 2006.11a / pp.575-579 / 2006
  • An Intelligent Tutoring System is composed of a tutoring module, a student module, and an expert module, together with an interface module. Among these, the interface module is the most closely related to students and accounts for the largest part of a learning system. The interface module is important for students not only in e-learning systems but also in intelligent tutoring systems, yet research on improving the interface has not been actively pursued. This study examines interface improvements that are closely related to e-learning students, analyzes the main components of existing interfaces, and designs a multimedia interface to correct the problems of previous research. It also suggests a method for connecting the interface to the intelligent tutoring system.

Teleoperation of Field Mobile Manipulator with Wearable Haptic-based Multi-Modal User Interface and Its Application to Explosive Ordnance Disposal

  • Ryu Dongseok;Hwang Chang-Soon;Kang Sungchul;Kim Munsang;Song Jae-Bok
    • Journal of Mechanical Science and Technology / v.19 no.10 / pp.1864-1874 / 2005
  • This paper describes a wearable multi-modal user interface design and its implementation for a teleoperated field robot system. Recently, teleoperated field robots have been employed in hazardous-environment applications (e.g. rescue, explosive ordnance disposal, security). To complete these missions in outdoor environments, the robot system must have appropriate functions, accuracy, and reliability. However, the more functions it has, the more difficult it becomes to operate them, so an effective user interface should be developed. Furthermore, the user interface needs to be wearable for portability and prompt action. This research starts from the question of how the complicated slave robot can be teleoperated easily. The main challenge is to make a simple and intuitive user interface with a wearable shape and size. The interface provides multiple modalities, namely visual, auditory, and haptic senses, enabling an operator to control every function of the field robot more intuitively. As a result, an EOD (explosive ordnance disposal) demonstration is conducted to verify the validity of the proposed wearable multi-modal user interface.

Internet-Based Remote Control of the Intelligent Robot (지능형 로봇의 인터넷 기반 원격 제어)

  • Yu, Young-Sun;Kim, Jong-Sun;Kim, Hyong-Suk;Joo, Young-Hoon
    • Journal of Institute of Control, Robotics and Systems / v.13 no.3 / pp.242-248 / 2007
  • In this paper, we implement an internet-based remote control system for an intelligent robot. For remote control, the system uses TCP/IP socket communication and consists of a user interface and a robot control interface. The robot control interface transmits the robot's navigation and environmental information to the user interface; to transmit the large environmental images, a JPEG compression algorithm is used. The user interface displays the navigation status of the robot and transmits navigation commands to the robot control interface. We also propose a design method for the fuzzy controller using navigation data acquired from an expert's knowledge or experience, employing a virus-evolutionary genetic algorithm (VEGA). Finally, we show that the proposed system can be operated through real-world experiments.
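
The abstract names TCP/IP sockets and JPEG compression but gives no transport details; the following is a minimal sketch, assuming a simple length-prefixed framing of already-JPEG-encoded camera frames over a TCP socket. The function names and framing format are illustrative, not the authors'.

```python
# Minimal sketch of length-prefixed JPEG frame transfer over TCP.
# Assumptions (not from the paper): frames arrive as JPEG-encoded bytes,
# and each message is framed as a 4-byte big-endian length plus payload.
import socket
import struct

def send_jpeg_frame(sock: socket.socket, jpeg_bytes: bytes) -> None:
    """Send one JPEG-encoded frame, prefixed with its length."""
    header = struct.pack(">I", len(jpeg_bytes))
    sock.sendall(header + jpeg_bytes)

def recv_jpeg_frame(sock: socket.socket) -> bytes:
    """Receive one length-prefixed JPEG frame."""
    header = _recv_exact(sock, 4)
    (length,) = struct.unpack(">I", header)
    return _recv_exact(sock, length)

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes or raise if the connection closes early."""
    buf = bytearray()
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf.extend(chunk)
    return bytes(buf)
```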

Human Indicator and Information Display using Space Human Interface in Networked Intelligent Space

  • Jin Tae-Seok;Niitsuma Mihoko;Hashimoto Hideki
    • Journal of the Korean Institute of Intelligent Systems / v.15 no.5 / pp.632-638 / 2005
  • This paper describes a new data-handling approach, based on a Spatial Human Interface acting as a human indicator, for the Spatial-Knowledge-Tags (SKT) in spatial memory. The Spatial Human Interface (SHI) is a new system that facilitates human activity in a working environment. The SHI stores human activity data, as knowledge and as a history of human activity, into the Spatial Memory of a working environment treated as the three-dimensional space in which a person acts, and loads the data through Spatial-Knowledge-Tags (SKT) to support the enhancement of human activity. To realize this, the purpose of the SHI is to construct a new relationship among humans and distributed, networked computers and sensors based on intuitive and simultaneous interactions. In this paper, the specific functions of SKT and the method for realizing SKT are explained. The utility of SKT is demonstrated in designing a robot motion controller.
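
The abstract does not specify how an SKT is represented; as one possible illustration only, a tag record anchored at a 3D position and carrying an activity history might look like the following minimal sketch. All field names are assumptions, not the authors' design.

```python
# Minimal sketch (not the authors' implementation) of a Spatial-Knowledge-Tag:
# a record anchored at a 3D position in the working environment that
# accumulates a history of observed human activity.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SpatialKnowledgeTag:
    position: Tuple[float, float, float]          # anchor point in the 3D workspace (units assumed)
    label: str                                    # e.g. "desk", "door" (illustrative)
    activity_history: List[str] = field(default_factory=list)

    def record(self, activity: str) -> None:
        """Append an observed human activity to this tag's history."""
        self.activity_history.append(activity)

# Usage: a networked sensor that observes a person at the desk could do
# desk_tag = SpatialKnowledgeTag(position=(1.2, 0.8, 0.0), label="desk")
# desk_tag.record("person sat down")
```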

Dynamic Walking Control of Biped Walking Robot using Intelligent Control Method and Sensor Interface (지능형 제어기법 및 센서 인터페이스를 이용한 이족 보행 로봇의 동적보행 제어)

  • Kho, Jaw-Won;Lim, Dong-Cheol
    • The Transactions of the Korean Institute of Electrical Engineers P / v.56 no.4 / pp.161-167 / 2007
  • This paper introduces dynamic walking control of a biped walking robot using an intelligent sensor interface and presents an intelligent control method for the robot. For the dynamic walking control of the biped walking robot, several motion controllers are used: a main controller (using an INTEL80C296SA MPU), a sub-controller (using a TMS320LF2406 DSP), a sensor controller (using an ATmega128 MPU), etc. The sensors used are a gyro sensor, a tilt sensor, infrared sensors, FSR sensors, etc. To verify the feasibility of dynamic walking control, we use a biped walking robot with twenty-five degrees of freedom (DOF) in total: two legs of six DOF each, two arms of five DOF each, a waist with two DOF, and a head with one DOF.
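
The joint layout quoted in the abstract can be tallied directly; the short sketch below simply restates those DOF counts and checks that they sum to the stated twenty-five.

```python
# DOF budget of the biped robot as quoted in the abstract:
# two 6-DOF legs, two 5-DOF arms, a 2-DOF waist, a 1-DOF head.
dof_per_limb = {
    "left_leg": 6, "right_leg": 6,
    "left_arm": 5, "right_arm": 5,
    "waist": 2,
    "head": 1,
}

total_dof = sum(dof_per_limb.values())
assert total_dof == 25, total_dof   # matches the 25 DOF stated in the abstract
print(f"total degrees of freedom: {total_dof}")
```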

A Full Body Gumdo Game with an Intelligent Cyber Fencer using a Multi-modal (3D Vision and Speech) Interface (멀티모달 인터페이스(3차원 시각과 음성)를 이용한 지능적 가상검객과의 전신 검도게임)

  • 윤정원;김세환;류제하;우운택
    • Journal of KIISE: Computing Practices and Letters / v.9 no.4 / pp.420-430 / 2003
  • This paper presents an immersive multimodal Gumdo simulation game that allows a user to experience whole-body interaction with an intelligent cyber fencer. The proposed system consists of three modules: (i) a non-distracting multimodal interface with 3D vision and speech, (ii) an intelligent cyber fencer, and (iii) immersive feedback through a large screen and sound. First, the multimodal interface with 3D vision and speech allows a user to move around and shout without being distracted. Second, the intelligent cyber fencer provides intelligent interactions through perception and reaction modules created from the analysis of real Gumdo games. Finally, immersive audio-visual feedback through a large screen and sound effects helps the user experience an immersive interaction. The proposed system thus provides the user with an immersive Gumdo experience involving whole-body movement. The suggested system can be applied to various applications such as education, exercise, and art performance.
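
The cyber fencer's perception and reaction modules are only named in the abstract; purely as an illustration of the idea, a reaction module could be a lookup from a perceived player action to a fencer response. The action and response names below are invented, not taken from the paper.

```python
# Illustrative sketch only: a table-driven reaction module mapping a
# perceived player action (from 3D vision/speech) to a cyber fencer response.
import random

REACTIONS = {
    "head_strike":  ["block_high", "step_back"],
    "waist_strike": ["block_middle", "counter_waist"],
    "wrist_strike": ["block_low", "counter_wrist"],
    "shout":        ["assume_guard"],
}

def react(perceived_action: str) -> str:
    """Choose a fencer reaction for the perceived player action."""
    options = REACTIONS.get(perceived_action, ["hold_guard"])
    return random.choice(options)

# Example: react("head_strike") -> "block_high" or "step_back"
```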

An Intelligent Search Modeling using Avatar Agent

  • Kim, Dae Su
    • International Journal of Fuzzy Logic and Intelligent Systems / v.4 no.3 / pp.288-291 / 2004
  • This paper proposes an intelligent search model using an avatar agent. The system consists of modules such as an agent interface, agent management, a preprocessor, and an interface machine. The Core-Symbol Database and Spell Checker are attached to the preprocessor module, and the Interface Machine is connected to the Best Aggregate Designer. Our avatar agent system performs indexing that converts a user's natural-language sentence into words suitable for domain-specific information retrieval. Indexing is one of the preprocessing steps that guarantees the specificity of the user's input and increases the reliability of the result; it references a database consisting of a synonym dictionary and a domain-specific dictionary. The resulting symbols after indexing are used for a draft search by an internet search engine, and the retrieved page positions and link information are stored in the database. We tested our system with the stock-market keywords SAMSUNG_SDI, IBM, and SONY and compared the results with those of the AltaVista and Google search engines; it showed quite good results.
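
The indexing step described above, converting natural-language input into domain-specific index terms via a synonym dictionary, can be illustrated with a minimal sketch. The tiny dictionary and the substring matching here are assumptions for illustration, not the paper's resources or algorithm.

```python
# Minimal sketch of synonym-dictionary indexing: map a free-form query
# to canonical domain terms. The dictionary contents are illustrative only.
import re
from typing import List

SYNONYMS = {
    "samsung sdi": "SAMSUNG_SDI",
    "samsungsdi": "SAMSUNG_SDI",
    "ibm": "IBM",
    "sony": "SONY",
    "share price": "stock_price",
    "stock price": "stock_price",
}

def index_query(sentence: str) -> List[str]:
    """Return canonical index terms found in a natural-language sentence."""
    text = re.sub(r"[^a-z0-9 ]", " ", sentence.lower())
    terms = []
    for phrase, symbol in SYNONYMS.items():
        if phrase in text and symbol not in terms:
            terms.append(symbol)
    return terms

# Example: index_query("What is the share price of Samsung SDI today?")
# -> ["SAMSUNG_SDI", "stock_price"]
```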