• Title/Summary/Keyword: Motion Trajectory (동작 궤적)


Learning and Generation of Motion Trajectory in a Humanoid Robot (인간형 로봇의 동작궤적 학습 및 생성)

  • 진영규;사공준;최진영
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2001.05a / pp.131-135 / 2001
  • This paper presents two methods: a motion trajectory memory (MTM) that can generate trajectories of various shapes according to task variables or the intent of a motion, and a sampling time interval calculation considering constraints (STICCON), which computes a motion time satisfying the velocity and acceleration constraints of the robot joints. Building on these two methods, the paper proposes a structural framework for generating motion trajectories of a humanoid robot. The proposed approach has the two properties that a trajectory generation method for a humanoid robot should possess: the ability to represent the trajectories of complex and diverse motions, and the ability to dynamically deform trajectories according to constraints.

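The STICCON step described in this abstract, computing a motion time that respects joint velocity and acceleration limits, can be pictured as a time-scaling check over a sampled trajectory. The sketch below is a minimal illustration under that reading of the abstract, not the paper's algorithm; the finite-difference formulation, the example trajectory, and the function name are assumptions.

```python
import numpy as np

def min_sampling_interval(q, v_max, a_max):
    """Smallest uniform time step dt for which the finite-difference velocity
    and acceleration of the joint samples q stay within the given limits.

    q: (N, n_joints) joint positions along the trajectory.
    v_max, a_max: per-joint velocity and acceleration limits.
    """
    dq = np.diff(q, axis=0)            # first differences
    ddq = np.diff(q, n=2, axis=0)      # second differences
    dt_v = np.max(np.abs(dq) / v_max)              # |dq|/dt <= v_max  ->  dt >= |dq|/v_max
    dt_a = np.sqrt(np.max(np.abs(ddq) / a_max))    # |ddq|/dt^2 <= a_max -> dt >= sqrt(|ddq|/a_max)
    return max(dt_v, dt_a)

# Illustrative two-joint trajectory of 50 samples.
t = np.linspace(0.0, 1.0, 50)
q = np.stack([np.sin(np.pi * t), 0.5 * np.cos(np.pi * t)], axis=1)
dt = min_sampling_interval(q, v_max=np.array([1.0, 1.0]), a_max=np.array([2.0, 2.0]))
print("minimum sampling interval:", dt, "-> motion time:", dt * (len(q) - 1))
```

The total motion time is then the sampling interval times the number of steps, which is the quantity the abstract's constrained time computation produces.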

Searching Human Motion Data by Sketching 3D Trajectories (3차원 이동 궤적 묘사를 통한 인간 동작 데이터 검색)

  • Lee, Kang Hoon
    • Journal of the Korea Computer Graphics Society / v.19 no.2 / pp.1-8 / 2013
  • Captured human motion data has been widely utilized for understanding the mechanism of human motion and for synthesizing the animation of virtual characters. Searching for desired motions in given motion data is an important prerequisite for analyzing and editing those selected motions. This paper presents a new method of content-based motion retrieval that does not require additional metadata such as keywords. While existing search methods have focused on skeletal configurations of body pose or planar trajectories of locomotion, our method receives a three-dimensional trajectory as its input query and retrieves a set of motion intervals in which the trajectories of body parts such as the hands, feet, and pelvis are similar to the input trajectory. To allow the user to intuitively sketch spatial trajectories, we used the Leap Motion controller, which can precisely trace finger movements, as the input device for our experiments. We evaluated the effectiveness of our approach by conducting a user study in which the users searched for dozens of pre-selected motions in basketball motion data including a variety of moves such as dribbling and shooting.
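The core matching step in this kind of retrieval, comparing a sketched 3-D trajectory against body-part trajectories over sliding motion intervals, can be sketched as below. The arc-length resampling, the translation-invariant distance, and the window and step sizes are illustrative assumptions, not the similarity measure used in the paper.

```python
import numpy as np

def resample(traj, n=32):
    """Linearly resample a 3-D polyline to n points by arc length."""
    d = np.linalg.norm(np.diff(traj, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(d)])
    u = np.linspace(0.0, s[-1], n)
    return np.stack([np.interp(u, s, traj[:, k]) for k in range(3)], axis=1)

def sketch_distance(query, segment, n=32):
    """Distance between two 3-D trajectories, ignoring absolute position."""
    a, b = resample(query, n), resample(segment, n)
    a -= a.mean(axis=0)
    b -= b.mean(axis=0)
    return np.linalg.norm(a - b) / n

def search(query, body_part_traj, window=60, step=10, k=5):
    """Return the k motion intervals whose body-part trajectory best matches the sketch."""
    scores = []
    for start in range(0, len(body_part_traj) - window, step):
        segment = body_part_traj[start:start + window]
        scores.append((sketch_distance(query, segment), start, start + window))
    return sorted(scores)[:k]
```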

Trajectory Generation for a Biped Robot Using ELIPM (ELIPM을 이용한 이족보행로봇의 궤적생성)

  • Park, Goun-Woo;Choi, See-Myoung;Park, Jong-Hyeon
    • Transactions of the Korean Society of Mechanical Engineers A / v.35 no.7 / pp.767-772 / 2011
  • Trajectory generation is important because it determines the walking stability, continuity, and performance of a body in motion. Generally, the Linear Inverted Pendulum Mode is used for trajectory generation; however, for the sake of simplicity, the trajectory in this mode does not allow vertical motions and pitching motions of the body. This paper proposes a new trajectory generation method called Extended Linear Inverted Pendulum Mode (ELIPM) that allows vertical motion as well as pitching motion. This method can also improve the performance of locomotion by controlling the stride and locomotion frequency of a body.
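The abstract refers to the standard Linear Inverted Pendulum Mode as the usual starting point, which ELIPM then extends with vertical and pitching motion. For reference, the closed-form center-of-mass motion of the unextended LIPM over one support phase looks like the following sketch; the constant CoM height and the numeric parameters are illustrative, and the paper's extension itself is not reproduced here.

```python
import numpy as np

G = 9.81  # gravitational acceleration [m/s^2]

def lipm_com(x0, v0, z_c, t):
    """CoM position and velocity of the standard Linear Inverted Pendulum Mode
    at times t, for constant CoM height z_c and the support foot at the origin."""
    Tc = np.sqrt(z_c / G)                               # pendulum time constant
    x = x0 * np.cosh(t / Tc) + Tc * v0 * np.sinh(t / Tc)
    v = (x0 / Tc) * np.sinh(t / Tc) + v0 * np.cosh(t / Tc)
    return x, v

# One single-support phase: CoM starts 5 cm behind the foot, moving forward.
t = np.linspace(0.0, 0.4, 100)
x, v = lipm_com(x0=-0.05, v0=0.4, z_c=0.8, t=t)
print("CoM at end of phase: %.3f m, velocity %.3f m/s" % (x[-1], v[-1]))
```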

Vision-Based Trajectory Tracking Control System for a Quadrotor-Type UAV in Indoor Environment (실내 환경에서의 쿼드로터형 무인 비행체를 위한 비전 기반의 궤적 추종 제어 시스템)

  • Shi, Hyoseok;Park, Hyun;Kim, Heon-Hui;Park, Kwang-Hyun
    • The Journal of Korean Institute of Communications and Information Sciences / v.39C no.1 / pp.47-59 / 2014
  • This paper deals with a vision-based trajectory tracking control system for a quadrotor-type UAV intended for entertainment in an indoor environment. In contrast to outdoor flights, which emphasize autonomy to complete special missions such as aerial photography and reconnaissance, indoor flights for entertainment require trajectory-following and hovering skills with high precision and stability. This paper proposes a trajectory tracking control system consisting of a motion generation module, a pose estimation module, and a trajectory tracking module. The motion generation module generates a sequence of motions specified by 3-D locations at each sampling time. In the pose estimation module, the 3-D position and orientation of the quadrotor are estimated by recognizing a circular ring pattern installed on the vehicle. The trajectory tracking module controls the 3-D position of the quadrotor in real time using the information from the motion generation module and the pose estimation module. The proposed system is tested through several experiments on one-point, multi-point, and trajectory tracking control.
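One way to picture how the three modules fit together is a control step that turns the vision-based pose estimate and the current waypoint from the motion generation module into a command for the vehicle. The sketch below is a generic PD formulation under that assumption, not the controller used in the paper; the gains, tolerance, and function names are hypothetical.

```python
import numpy as np

def pd_position_command(p_est, v_est, p_ref, kp=1.2, kd=0.4):
    """PD position controller: command toward the reference point p_ref,
    given the vision-based position and velocity estimates p_est and v_est."""
    return kp * (np.asarray(p_ref) - np.asarray(p_est)) - kd * np.asarray(v_est)

def next_waypoint(p_est, waypoints, index, tolerance=0.05):
    """Advance to the next waypoint once the current one has been reached."""
    reached = np.linalg.norm(np.asarray(waypoints[index]) - np.asarray(p_est)) < tolerance
    if reached and index < len(waypoints) - 1:
        return index + 1
    return index
```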

ROS-based Pick-and-Place Motion Control for a Robot Arm of 4 Degrees of Freedom (자유도-4 로봇 팔을 위한 ROS 기반 Pick-and-Place 동작 제어)

  • Kim, Young-Ju
    • Proceedings of the Korean Society of Computer Information Conference / 2018.01a / pp.53-54 / 2018
  • This paper implements pick-and-place motion control for a robot arm with 4 degrees of freedom based on the ROS framework and presents a case in which it is applied to a tic-tac-toe game. Pick-and-place control of a robot arm requires motion trajectory planning, collision avoidance, and inverse kinematics computations, as well as a complex control process that uses them. The ROS framework provides the MoveIt package, which integrates this series of computations and control operations so that the motion of a robot arm can be controlled easily through a simple interface; based on this package, we implemented a motion control module for the 4-DOF robot arm. We also applied the module to a tic-tac-toe game and confirmed that the robot arm is controlled appropriately.

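Since the abstract points to MoveIt as the interface that hides trajectory planning, collision avoidance, and inverse kinematics, a minimal pick-and-place loop with the Python `moveit_commander` API might look like the sketch below. The planning-group names "arm" and "gripper", the named gripper targets, and all coordinates are assumptions about the robot's MoveIt configuration, not details taken from the paper.

```python
#!/usr/bin/env python
import sys
import rospy
import moveit_commander

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("pick_and_place_demo")

arm = moveit_commander.MoveGroupCommander("arm")          # assumed group name
gripper = moveit_commander.MoveGroupCommander("gripper")  # assumed group name

def move_to(xyz):
    """Plan and execute a motion of the end effector to a Cartesian position.
    With only 4 DOF, a position-only target is used instead of a full pose."""
    arm.set_position_target(xyz)
    ok = arm.go(wait=True)
    arm.stop()
    arm.clear_pose_targets()
    return ok

def set_gripper(state):
    gripper.set_named_target(state)   # "open" or "closed", assumed to exist in the SRDF
    gripper.go(wait=True)

# Pick a piece and place it on a tic-tac-toe cell (coordinates are illustrative).
set_gripper("open")
move_to([0.20, 0.00, 0.05])   # above the piece
set_gripper("closed")         # grasp
move_to([0.15, 0.10, 0.05])   # above the target cell
set_gripper("open")           # release

moveit_commander.roscpp_shutdown()
```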

Gesture Input System in 3-D Space by Using Inertial Sensors (관성 센서를 이용한 공간상의 제스처 입력 시스템)

  • Cho, Sung-Jung;Bang, Won-Chul;Chang, Wook;Choi, Eun-Seok;Yang, Jing;Oh, Jong-Gu;Kang, Kyung-Ho;Cho, Joon-Kee;Kim, Dong-Yoon
    • Proceedings of the Korean Information Science Society Conference / 2004.04b / pp.709-711 / 2004
  • This paper introduces a system that recognizes gestures by capturing the user's motion in 3-D space with inertial sensors. A gesture performed by the user is converted into sequences of angular velocity and acceleration signals by the inertial sensors. A trajectory estimation algorithm converts these signals into a two-dimensional motion trajectory. The recognition algorithm takes this motion trajectory as input, computes the likelihood under each gesture model based on Bayesian networks, and selects the model with the maximum likelihood. In experiments with 13 gestures collected 24 times each from 16 writers, an average recognition rate of 99.4% was obtained.

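The recognition step described in the abstract, computing a likelihood under each gesture model and picking the maximum, can be sketched as follows. A simple per-point Gaussian model stands in for the Bayesian networks used in the paper, so the model structure here is an assumption.

```python
import numpy as np

def log_likelihood(traj, model):
    """Log-likelihood of a resampled 2-D trajectory under a per-class Gaussian
    model (mean trajectory and per-point variance)."""
    diff = traj - model["mean"]
    return -0.5 * np.sum(diff ** 2 / model["var"] + np.log(2 * np.pi * model["var"]))

def recognize(traj, models):
    """Pick the gesture class whose model gives the maximum likelihood."""
    scores = {name: log_likelihood(traj, m) for name, m in models.items()}
    return max(scores, key=scores.get), scores
```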

Near-Minimum Time Trajectory Planning of Two Robots with Collision Avoidance (두 대의 로봇의 근사 최소시간 제어를 위한 충돌회피 궤적 계획)

  • Lee, Dong-Soo;Chong, Nak-Young;Suh, Il-Hong;Choi, Dong-Hoon;Lyou, Joon
    • Transactions of the Korean Society of Mechanical Engineers / v.15 no.5 / pp.1495-1502 / 1991
  • This study proposes a trajectory planning method that allows two robots sharing the same workspace to traverse their specified paths in near-minimum time while satisfying their torque limits and collision-avoidance conditions. Based on motion priority, one robot is designated as the master robot and the other as the slave robot: the master robot's trajectory is planned so that it moves along its given path in minimum time while satisfying the input torque limits, and the slave robot moves along its given path in near-minimum time while avoiding collision with the master robot and satisfying its input torque limits.
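The master/slave priority idea can be caricatured as follows: keep the master on its minimum-time schedule and delay the slave's start just enough that a clearance distance is never violated. This is a toy coordination sketch that ignores the torque constraints central to the paper; the clearance value, time step, and function name are illustrative.

```python
import numpy as np

def min_start_delay(master_path, slave_path, times, clearance=0.10,
                    dt=0.01, max_delay=10.0):
    """Smallest start delay for the slave robot such that, with both robots
    following the same schedule `times` along their own paths, the tool points
    never come closer than `clearance`.

    master_path, slave_path: (N, 3) tool-point positions along each path.
    times: (N,) minimum-time schedule for the path samples.
    """
    def pos(path, t, shift=0.0):
        # Interpolate the tool position at time t, holding the path endpoints.
        return np.array([np.interp(t - shift, times, path[:, k]) for k in range(3)])

    horizon = times[-1] + max_delay
    for delay in np.arange(0.0, max_delay, dt):
        ts = np.arange(0.0, horizon, dt)
        if all(np.linalg.norm(pos(master_path, t) - pos(slave_path, t, delay)) >= clearance
               for t in ts):
            return delay
    return None  # no safe delay found within max_delay
```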

Remote Control of Network-Based Modular Robot (네트웍 기반 모듈라 로봇의 원격 제어)

  • Yeom, Dong-Joo;Lee, Bo-Hee
    • Journal of Convergence for Information Technology / v.8 no.5 / pp.77-83 / 2018
  • A modular robot that memorizes motions can be easily assembled and operated because its motions are taught directly by hand. However, because a module does not have enough storage space to hold the motions a user creates, those motions cannot be reused, and when the modular robot memorizes a motion again it changes into a different motion. There is also no main controller capable of operating several modular robots at the same time, so the user must input commands directly on each modular robot. To overcome these disadvantages, a remote controller that runs on nearby smart devices is proposed, designed as web-server-based and component-based software over wired and wireless networks. With the proposed method, various structures are built by connecting modular robots, stored motions are downloaded and replayed, and the usefulness of the approach is confirmed by effectively regenerating the stored motions. In addition, the reliability of the downloaded trajectory data is verified by analyzing the difference between the trajectory data and the actual trajectory. In future work, the trajectories stored in the remote controller will be standardized using artificial intelligence techniques so that motions of the modular robot can be implemented easily.
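The reliability check mentioned above, comparing the downloaded trajectory data with the trajectory actually executed, can be quantified with a simple per-joint RMS difference. The resampling scheme and function name below are assumptions rather than the analysis used in the paper.

```python
import numpy as np

def trajectory_error(stored, measured):
    """Per-joint RMS difference between a stored (downloaded) trajectory and
    the trajectory actually executed, after resampling both to the same length."""
    stored, measured = np.asarray(stored), np.asarray(measured)
    n = min(len(stored), len(measured))
    idx_s = np.linspace(0, len(stored) - 1, n).astype(int)
    idx_m = np.linspace(0, len(measured) - 1, n).astype(int)
    diff = stored[idx_s] - measured[idx_m]
    return np.sqrt(np.mean(diff ** 2, axis=0))
```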

Noise-Robust Capturing and Animating Facial Expression by Using an Optical Motion Capture System (광학식 동작 포착 장비를 이용한 노이즈에 강건한 얼굴 애니메이션 제작)

  • Park, Sang-Il
    • Journal of Korea Game Society / v.10 no.5 / pp.103-113 / 2010
  • In this paper, we present a practical method for generating facial animation by using an optical motion capture system. In our setup, we assumed a situation of capturing the body motion and the facial expression simultaneously, which degrades the quality of the captured marker data. To overcome this problem, we provide an integrated framework based on the local coordinate system of each marker for labeling the marker data, hole-filling and removing noises. We justify the method by applying it to generate a short animated film.
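Of the three steps named in the abstract (labeling, hole-filling, noise removal), hole-filling is the easiest to picture: missing frames of a marker are reconstructed from the surrounding valid frames. The sketch below uses plain linear interpolation and does not exploit the per-marker local coordinate frames that the paper relies on, so it should be read only as a baseline illustration.

```python
import numpy as np

def fill_holes(marker, valid):
    """Fill missing frames of a single marker trajectory by linear interpolation
    between the surrounding valid frames.

    marker: (N, 3) positions; valid: (N,) boolean mask of tracked frames.
    """
    filled = np.array(marker, dtype=float)
    frames = np.arange(len(filled))
    for k in range(3):
        filled[~valid, k] = np.interp(frames[~valid], frames[valid], filled[valid, k])
    return filled
```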

On-line Motion Synthesis Using Analytically Differentiable System Dynamics (분석적으로 미분 가능한 시스템 동역학을 이용한 온라인 동작 합성 기법)

  • Han, Daseong;Noh, Junyong;Shin, Joseph S.
    • Journal of the Korea Computer Graphics Society / v.25 no.3 / pp.133-142 / 2019
  • In physics-based character animation, trajectory optimization has been widely adopted for automatic motion synthesis, through the prediction of an optimal sequence of future states of the character based on its system dynamics model. In general, the system dynamics model is neither in a closed form nor differentiable when it handles the contact dynamics between a character and the environment with rigid body collisions. Employing smoothed contact dynamics, researchers have suggested efficient trajectory optimization techniques based on numerical differentiation of the resulting system dynamics. However, the numerical derivative of the system dynamics model could be inaccurate unlike its analytical counterpart, which may affect the stability of trajectory optimization. In this paper, we propose a novel method to derive the closed-form derivative for the system dynamics by properly approximating the contact model. Based on the resulting derivatives of the system dynamics model, we also present a model predictive control (MPC)-based motion synthesis framework to robustly control the motion of a biped character according to on-line user input without any example motion data.
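The key ingredient in this abstract is a smoothed contact model whose derivative exists in closed form. As an illustration of that idea only, and not the specific approximation derived in the paper, a softplus-shaped normal force in the signed distance has an analytical derivative everywhere:

```python
import numpy as np

def smooth_contact_force(d, k=1.0e4, beta=50.0):
    """Smoothed normal contact force as a function of signed distance d
    (negative = penetration), using a softplus barrier so that both the force
    and its derivative are defined in closed form everywhere.

    Returns the force f(d) = (k/beta) * softplus(-beta*d) and df/dd.
    """
    f = k / beta * np.logaddexp(0.0, -beta * d)         # numerically stable softplus
    dfdd = -k * 0.5 * (1.0 - np.tanh(0.5 * beta * d))   # = -k * sigmoid(-beta*d)
    return f, dfdd
```

Because `dfdd` is available analytically, a trajectory optimizer or MPC loop built on such a model can use exact gradients of the contact force instead of numerical differentiation, which is the benefit the abstract argues for.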