• Title/Abstract/Keywords: robotic vision

Search results: 126 (processing time: 0.027 s)

시각을 이용한 이동 로봇의 강건한 경로선 추종 주행 (Vision-Based Mobile Robot Navigation by Robust Path Line Tracking)

  • 손민혁;도용태
    • Journal of Sensor Science and Technology / Vol. 20, No. 3 / pp.178-186 / 2011
  • Line tracking is a well-defined method of mobile robot navigation. It is simple in concept, technically easy to implement, and already employed in many industrial sites. Among the different line tracking methods, magnetic sensing is widely used in practice. In comparison, vision-based tracking is less popular, mainly due to its sensitivity to surrounding conditions such as brightness and floor characteristics, although vision is the most powerful robotic sensing capability. In this paper, a vision-based robust path line detection technique is proposed for the navigation of a mobile robot under uncontrollable surrounding conditions. The proposed technique has four processing steps: color space transformation, pixel-level line sensing, block-level line sensing, and robot navigation control. It uses the hue and saturation color values in line sensing so as to be insensitive to brightness variation. Block-level line finding not only makes the technique immune to line-pixel detection errors but also simplifies robot control. The proposed technique was tested with a real mobile robot and proved its effectiveness.
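The hue/saturation-based line sensing and block-level voting described above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the red-line hue range, saturation threshold, block size, and majority rule are all assumed values.

```python
import colorsys

# Assumed threshold: pixel must be saturated enough to be painted line, not bare floor.
SAT_MIN = 0.5

def is_line_pixel(r, g, b):
    """Pixel-level sensing: classify by hue and saturation only, so that
    brightness (the HSV value channel) does not affect the decision."""
    h, s, _v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    near_red = h <= 0.05 or h >= 0.95  # hue wraps around at red
    return near_red and s >= SAT_MIN

def block_level_line(image, block=2):
    """Block-level sensing: a block counts as 'line' when at least half of
    its pixels do, which masks isolated pixel-level detection errors."""
    rows, cols = len(image), len(image[0])
    grid = []
    for by in range(0, rows, block):
        grid_row = []
        for bx in range(0, cols, block):
            ys = range(by, min(by + block, rows))
            xs = range(bx, min(bx + block, cols))
            hits = sum(is_line_pixel(*image[y][x]) for y in ys for x in xs)
            grid_row.append(2 * hits >= len(ys) * len(xs))
        grid.append(grid_row)
    return grid

# Tiny synthetic frame: a one-pixel-wide red line at x == 2 on a gray floor.
img = [[(200, 10, 10) if x == 2 else (90, 90, 90) for x in range(8)]
       for y in range(8)]
grid = block_level_line(img)
```

The resulting block grid marks only the column containing the line, which a navigation controller can then follow.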

A Vision-Based Method to Find Fingertips in a Closed Hand

  • Chaudhary, Ankit;Vatwani, Kapil;Agrawal, Tushar;Raheja, J.L.
    • Journal of Information Processing Systems / Vol. 8, No. 3 / pp.399-408 / 2012
  • Hand gesture recognition is an important area of research in the field of Human Computer Interaction (HCI). The geometric attributes of the hand play an important role in hand shape reconstruction and gesture recognition. In particular, fingertips are among the most important attributes for the detection of hand gestures and can provide valuable information from hand images. Many methods are available in the scientific literature for fingertip detection with an open hand, but results are very poor when the hand is closed. This paper presents a new method for the detection of fingertips in a closed hand using a corner detection method and an advanced edge detection algorithm. Notably, skin color segmentation did not work for fingertip detection in a closed hand. The proposed method therefore applies Gabor filter techniques to detect edges and then applies a corner detection algorithm to find fingertips along the edges. To check its accuracy, the method was tested on a large number of images taken with a webcam and achieved a high detection rate. It was further run on video to test its validity for real-time image capture. This closed-hand fingertip detection could help in controlling an electro-mechanical robotic hand via hand gestures in a natural way.
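A Gabor filter of the kind used for edge extraction in this line of work can be generated as below. This is a generic sketch: the paper's actual filter parameters (orientation, wavelength, envelope) are not given, so the defaults here are assumptions.

```python
import math

def gabor_kernel(size=9, theta=0.0, sigma=2.0, lambd=4.0, gamma=0.5):
    """Real part of a Gabor kernel: a Gaussian envelope modulating a cosine
    carrier, oriented at angle theta. Convolving an image with kernels at
    several orientations highlights edges running in those directions."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # Rotate coordinates into the filter's orientation.
            xp = x * math.cos(theta) + y * math.sin(theta)
            yp = -x * math.sin(theta) + y * math.cos(theta)
            envelope = math.exp(-(xp * xp + gamma * gamma * yp * yp)
                                / (2 * sigma * sigma))
            row.append(envelope * math.cos(2 * math.pi * xp / lambd))
        kernel.append(row)
    return kernel

k = gabor_kernel()
```

A bank of such kernels at different `theta` values, applied by convolution, yields the edge map on which a corner detector can then locate fingertip candidates.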

A Study on the Implementation of RFID-based Autonomous Navigation System for Robotic Cellular Phone(RCP)

  • Choe, Jae-Il;Choi, Jung-Wook;Oh, Dong-Ik;Kim, Seung-Woo
    • Institute of Control, Robotics and Systems: Conference Proceedings / ICCAS 2005 / pp.457-462 / 2005
  • The industrial and economical importance of the CP (Cellular Phone) is growing rapidly. Combined with IT technology, the CP is currently one of the most attractive technologies for all. However, unless we find a breakthrough, its growth may soon slow down. RT (Robot Technology) is considered one of the most promising next-generation technologies. Unlike the industrial robots of the past, today's robots require advanced technologies such as soft computing, human-friendly interfaces, interaction techniques, speech recognition, object recognition, and many others. In this study, we present a new technological concept named RCP (Robotic Cellular Phone), which combines RT and CP, with the vision of opening a new direction for the advance of CP, IT, and RT together. RCP consists of three sub-modules: $RCP^{Mobility}$, $RCP^{Interaction}$, and $RCP^{Integration}$. $RCP^{Mobility}$, the main focus of this paper, is an autonomous navigation system that combines RT mobility with the CP. Through $RCP^{Mobility}$, we should be able to provide the CP with robotic functionalities such as auto-charging and real-world robotic entertainment. Eventually, the CP may become a robotic pet to human beings. $RCP^{Mobility}$ consists of various controllers, two of the main ones being the trajectory controller and the self-localization controller. While the trajectory controller is responsible for the wheel-based navigation of the RCP, the self-localization controller provides localization information for the moving RCP. With the coordinate information acquired from the RFID-based self-localization controller, the trajectory controller refines the RCP's movement to achieve better navigation. In this paper, a prototype system we developed for $RCP^{Mobility}$ is presented. We describe the overall structure of the system and provide experimental results of RCP navigation.


RFID를 이용한 RCP 자율 네비게이션 시스템 구현을 위한 연구 (A Study on the Implementation of RFID-Based Autonomous Navigation System for Robotic Cellular Phone (RCP))

  • 최재일;최정욱;오동익;김승우
    • Journal of Institute of Control, Robotics and Systems / Vol. 12, No. 5 / pp.480-488 / 2006
  • The industrial and economical importance of the CP (Cellular Phone) is growing rapidly. Combined with IT technology, the CP is one of the most attractive technologies of today. However, unless we find a new breakthrough, its growth may soon slow down. RT (Robot Technology) is considered one of the most promising next-generation technologies. Unlike the industrial robots of the past, today's robots require advanced features such as soft computing, human-friendly interfaces, interaction techniques, speech recognition, object recognition, and many others. In this paper, we present a new technological concept named RCP (Robotic Cellular Phone), which integrates RT and CP with the vision of opening a combined advancement of CP, IT, and RT. RCP consists of three sub-modules: $RCP^{Mobility}$ (RCP Mobility System), $RCP^{Interaction}$, and $RCP^{Integration}$. The main focus of this paper is $RCP^{Mobility}$, which combines an autonomous navigation system of RT mobility with the CP. Through $RCP^{Mobility}$, we are able to provide the CP with robotic functions such as auto-charging and real-world robotic entertainment. Ultimately, the CP may become a robotic pet to human beings. $RCP^{Mobility}$ consists of various controllers, two of the main ones being the trajectory controller and the self-localization controller. While the former is responsible for the wheel-based navigation of the RCP, the latter provides localization information for the moving RCP. With the coordinates acquired from the RFID-based self-localization controller, the trajectory controller refines the RCP's movement to achieve better navigation. In this paper, a prototype of $RCP^{Mobility}$ is presented. We describe the overall structure of the system and provide experimental results on RCP navigation.
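The interplay between the self-localization and trajectory controllers described above can be sketched as follows. This is a hypothetical simplification: the tag layout, the goal, and the proportional gain are invented for illustration and are not from the paper.

```python
import math

# Hypothetical RFID tag map: tag ID -> known floor coordinates in meters.
TAG_MAP = {"tag01": (0.0, 0.0), "tag02": (1.0, 0.0), "tag03": (1.0, 1.0)}

def self_localize(tag_id):
    """Self-localization controller: the last tag read gives the robot's
    approximate position on the floor."""
    return TAG_MAP[tag_id]

def refine_heading(tag_id, goal, current_heading, gain=0.5):
    """Trajectory controller: steer toward the goal using the RFID-derived
    position, returning a proportionally corrected heading in radians."""
    x, y = self_localize(tag_id)
    gx, gy = goal
    desired = math.atan2(gy - y, gx - x)
    # Wrap the heading error into (-pi, pi] before applying the gain.
    error = (desired - current_heading + math.pi) % (2 * math.pi) - math.pi
    return current_heading + gain * error

h = refine_heading("tag01", goal=(1.0, 1.0), current_heading=0.0)
```

Each new tag reading tightens the position estimate, so the wheel-level trajectory control drifts less than with odometry alone.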

발 움직임 검출을 통한 로봇 팔 제어에 관한 연구 (A Study on Robot Arm Control System using Detection of Foot Movement)

  • 지훈;이동훈
    • Journal of Rehabilitation Welfare Engineering &amp; Assistive Technology / Vol. 9, No. 1 / pp.67-72 / 2015
  • We implemented a system that controls a robot arm by detecting foot movement, for people with disabilities who cannot freely use their arms. Two cameras were installed in front of the feet to capture images of foot movement. Multiple regions of interest were set on the acquired images using the LabView-based Vision Assistant, and foot movement was detected from the left/right and up/down edges found within the left and right regions. Based on the number of left/right and up/down edges detected in the images of the two feet, control data for a six-joint robot arm were transmitted over serial communication, yielding a system in which the robot arm can be controlled up/down and left/right with the feet. Experiments showed a response time within 0.5 s and a motion recognition rate above 88%.
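The mapping from detected edge counts to arm commands and serial data can be illustrated as below. This is a hypothetical sketch: the abstract does not give the actual decision rules or serial frame format, so both are invented here.

```python
def classify_motion(left_edges, right_edges, up_edges, down_edges):
    """Pick the dominant motion direction from the edge counts detected
    in the foot ROIs (counts and tie-breaking are illustrative only)."""
    counts = {"UP": up_edges, "DOWN": down_edges,
              "LEFT": left_edges, "RIGHT": right_edges}
    direction = max(counts, key=counts.get)
    return direction if counts[direction] > 0 else None

def build_serial_frame(command):
    """Pack a command into a simple framed byte sequence for the serial
    link to the 6-joint arm controller (frame format is assumed)."""
    codes = {"UP": 0x01, "DOWN": 0x02, "LEFT": 0x03, "RIGHT": 0x04}
    payload = codes[command]
    checksum = (0xAA + payload) & 0xFF  # header + payload, modulo 256
    return bytes([0xAA, payload, checksum])

# Example: many edges detected in the right-foot ROI -> "RIGHT" command.
frame = build_serial_frame(classify_motion(0, 5, 2, 1))
```

In the real system the frame would be written to a serial port (e.g. via a library such as pyserial) for the arm controller to decode.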


Intelligent systems for control

  • Erickson, Jon D.
    • Institute of Control, Robotics and Systems: Conference Proceedings / Proceedings of the 11th Korea Automatic Control Conference (KACC); Pohang, Korea; 24-26 Oct. 1996 / pp.4-12 / 1996
  • This keynote presentation covers intelligent systems development for monitoring and control in various NASA space applications; similar intelligent systems technology also has terrestrial commercial applications. The general approach of intelligent systems is discussed, and intelligent systems under prototype development are described for possible use in the Space Shuttle Upgrade, in the Experimental Crew Return Vehicle, and in free-flying space robotic cameras, to provide autonomy to these spacecraft with flexible human intervention, if desired or needed. Development of intelligent system monitoring and control for regenerative life support subsystems, such as NASA's human-rated Bio-PLEX test facility, is also described. A video showing two recent world firsts in real-time vision-guided robotic arm and hand grasping of tumbling and translating complex-shaped objects in micro-gravity will also be shown.


로봇과 인간의 상호작용을 위한 얼굴 표정 인식 및 얼굴 표정 생성 기법 (Recognition and Generation of Facial Expression for Human-Robot Interaction)

  • 정성욱;김도윤;정명진;김도형
    • Journal of Institute of Control, Robotics and Systems / Vol. 12, No. 3 / pp.255-263 / 2006
  • In the last decade, face analysis (e.g., face detection, face recognition, and facial expression recognition) has been a lively and expanding research field. As computer-animated agents and robots bring a social dimension to human-computer interaction, interest in this field is increasing rapidly. In this paper, we introduce an artificial emotion mimic system that can recognize human facial expressions and also generate the recognized facial expression. To recognize human facial expressions in real time, we propose a facial expression classification method performed by weak classifiers obtained using new rectangular feature types. In addition, we generate artificial facial expressions using the developed robotic system, based on biological observation. Finally, experimental results of facial expression recognition and generation are shown to validate our robotic system.
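Rectangular features of the kind such weak classifiers evaluate are typically computed with an integral image. The sketch below shows the generic two-rectangle case, not the paper's new feature types, which are not specified in the abstract.

```python
def integral_image(img):
    """Summed-area table with a zero border:
    ii[y][x] = sum of img over rows [0, y) and columns [0, x)."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row_sum
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of pixels in the rectangle with top-left (x, y) and size w x h,
    in O(1) using four table lookups."""
    return ii[y + h][x + w] - ii[y][x + w] - ii[y + h][x] + ii[y][x]

def two_rect_feature(ii, x, y, w, h):
    """A simple two-rectangle feature: left half minus right half,
    which responds to vertical intensity edges."""
    return rect_sum(ii, x, y, w // 2, h) - rect_sum(ii, x + w // 2, y, w // 2, h)

# Toy 2x4 image: dark left half, bright right half.
img = [[1, 1, 5, 5],
       [1, 1, 5, 5]]
ii = integral_image(img)
```

A weak classifier then thresholds such feature values, and boosting combines many of them into the real-time expression classifier.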

비전을 활용한 사람을 따라다니는 로봇의 실내측위에 관한 연구 (The Study on Indoor Localization for Robots following Human using Vision Applications)

  • 전봉기
    • Journal of the Korea Institute of Information and Communication Engineering / Vol. 17, No. 6 / pp.1370-1374 / 2013
  • A robot carrier that follows its owner autonomously drew attention when it first appeared, and a robot vacuum cleaner that follows a person has recently been released. Robots that recognize and follow their owner or a light source, such as robot wheelchairs, are used in various applications. This study addresses the homing problem of a robot in the course of developing one that carries goods while following its owner. This paper proposes an indoor localization method that determines the robot's position using image processing techniques.

적외선 카메라를 이용한 용접비드를 제어하기 위한 알고리즘 개발 (Development of an algorithm for Controlling Welding Bead Using Infrared Thermography)

  • 김일수;박창언;손준식;박순영;정영재
    • Journal of Welding and Joining / Vol. 18, No. 6 / pp.55-61 / 2000
  • Dynamic monitoring of weld pool formation and seam deviations using infrared vision is described in this paper. Isothermal contours representing the heat dissipation characteristics during arc welding were analysed and processed using imaging techniques. The maximum bead width and penetration were recorded, and the geometric position in relation to the welding seam was measured at each sampling point. Deviations from the desired bead geometry and welding path were sensed, and their thermographic representations were digitised and subsequently identified. The evidence suggests that infrared thermography can be utilized to compensate for inaccuracies encountered in real time during robotic arc welding.
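Estimating bead width from an isothermal contour can be sketched as follows. This is an illustrative simplification: the isotherm threshold, temperature units, and image geometry are assumptions, not the paper's calibration.

```python
def bead_width_per_row(thermal, threshold):
    """For each scan row of a thermal image, find the longest run of pixels
    at or above the isotherm threshold; that run length approximates the
    bead width (in pixels) at that row."""
    widths = []
    for row in thermal:
        best = run = 0
        for t in row:
            run = run + 1 if t >= threshold else 0
            best = max(best, run)
        widths.append(best)
    return widths

# Synthetic 3x6 temperature map (arbitrary units); isotherm drawn at 500.
frame = [[300, 480, 520, 530, 470, 300],
         [300, 510, 540, 550, 505, 310],
         [290, 450, 515, 500, 460, 300]]
widths = bead_width_per_row(frame, 500)
max_width = max(widths)  # maximum bead width over the sampled rows
```

Comparing `max_width` (and the position of the hot run) against the desired bead geometry per sampling point gives the deviation signal a robotic welder could correct in real time.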


Fuzzy Neural Network Based Sensor Fusion and It's Application to Mobile Robot in Intelligent Robotic Space

  • Jin, Tae-Seok;Lee, Min-Jung;Hashimoto, Hideki
    • International Journal of Fuzzy Logic and Intelligent Systems / Vol. 6, No. 4 / pp.293-298 / 2006
  • In this paper, a sensor-fusion-based robot navigation method for the autonomous control of a miniature human interaction robot is presented. The navigation method blends the optimality of a Fuzzy Neural Network (FNN) based control algorithm with the knowledge-expression and learning capabilities of the networked Intelligent Robotic Space (IRS). States of the robot and the IR space, for example the distance between the mobile robot and obstacles and the velocity of the mobile robot, are used as inputs to the fuzzy logic controller. The navigation strategy is based on a combination of fuzzy rules tuned for both goal approach and obstacle avoidance. To identify the environment, a sensor fusion technique is introduced, in which the sensory data of ultrasonic sensors and a vision sensor are fused in the identification process. Preliminary experiments and results are shown to demonstrate the merit of the introduced navigation control algorithm.
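The blending of goal-approach and obstacle-avoidance behaviors can be sketched with a single fuzzy membership and weighted-average defuzzification. This is an illustrative toy, not the paper's FNN: the membership shape, range `d_max`, and headings are assumptions.

```python
def mu_near(distance, d_max=1.0):
    """Fuzzy membership of 'obstacle is near': 1.0 at contact, falling
    linearly to 0.0 at d_max and beyond."""
    return max(0.0, min(1.0, 1.0 - distance / d_max))

def steer(goal_heading, avoid_heading, obstacle_distance):
    """Blend the two behaviors: the nearer the obstacle, the more the
    avoidance heading dominates (weighted-average defuzzification)."""
    w = mu_near(obstacle_distance)
    return w * avoid_heading + (1.0 - w) * goal_heading

# Obstacle at 0.75 m: mostly goal-seeking, slightly deflected (radians).
h = steer(goal_heading=0.0, avoid_heading=1.0, obstacle_distance=0.75)
```

In the paper's setting the weights would come from the learned FNN and fused ultrasonic/vision distance estimates rather than a fixed linear membership.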