• Title/Summary/Keyword: Motion Interface


Analysis of Human Arm Movement During Vehicle Steering Maneuver

  • Tak, Tae-Oh; Kim, Kun-Young; Chun, Hyung-Ho
    • Journal of Mechanical Science and Technology / Vol. 19, Special Issue 1 / pp. 444-451 / 2005
  • The analysis of human arm motion during a steering maneuver is carried out to investigate the man-machine interface between the driver and the steering system. Each arm is modeled as an interconnection of upper arm, lower arm, and hand by rotational joints that properly represent the permissible joint motion, and both arms are connected to the steering wheel through springs and dampers at the contact points. The joint motion law during steering is determined through measurement of each arm's movement and subsequent inverse kinematic analysis. Combining the joint motion law with inverse dynamic analysis, the joint stiffness of the arm is estimated. An arm dynamic analysis model for the steering maneuver is set up and validated through comparison with experimentally measured data, showing relatively good agreement. To demonstrate the usefulness of the arm model, it is applied to study the effect of steering column angle on the steering motion.
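The inverse kinematic step described above can be sketched for a simplified planar two-link arm (upper arm and lower arm only; the link lengths and the reduction to two links are illustrative assumptions, not the paper's full model):

```python
import math

def two_link_ik(x, y, l1, l2):
    """Elbow-down inverse kinematics for a planar two-link arm.

    Given a hand contact point (x, y) on the steering wheel rim,
    return the shoulder and elbow joint angles in radians.
    """
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    c2 = max(-1.0, min(1.0, c2))           # clamp for numerical safety
    elbow = math.acos(c2)                  # elbow flexion angle
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def two_link_fk(shoulder, elbow, l1, l2):
    """Forward kinematics, used here only to verify the IK solution."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y
```

Solving the IK for measured hand positions over a steering cycle yields the joint motion law that the subsequent inverse dynamic analysis builds on.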

Study on User Interface for a Capacitive-Sensor Based Smart Device

  • Jung, Sun-IL; Kim, Young-Chul
    • Smart Media Journal / Vol. 8, No. 3 / pp. 47-52 / 2019
  • In this paper, we designed HW/SW interfaces for processing the signals of capacitive sensors such as the Electric Potential Sensor (EPS), which detect disturbances of the surrounding electric field as feature signals for motion recognition systems. We implemented a smart light control system with those interfaces. In the system, the on/off switch and brightness adjustment are controlled by hand gestures using the designed and fabricated interface circuits. PWM (Pulse Width Modulation) signals from the controller, through a driver IC, drive the LED and control its brightness and on/off operation. Using the hand-gesture signals obtained through EPS sensors and the interface HW/SW, we can not only construct a gesture-instruction system but also achieve faster recognition by developing dedicated interface hardware, including control circuitry. Finally, using the proposed hand-gesture recognition and signal processing methods, the light control module was also designed and implemented. The experimental results show that the smart light control system controls the LED module properly through accurate motion detection and gesture classification.
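A rough sketch of how classified gestures might map onto PWM-based brightness control (the gesture names, step size, and 0-100 duty-cycle scale are assumptions for illustration, not the paper's actual command set):

```python
def gesture_to_duty(gesture, duty, step=10):
    """Map a classified hand gesture to a new LED PWM duty cycle (0-100)."""
    if gesture == "on":
        return 100
    if gesture == "off":
        return 0
    if gesture == "brighter":
        return min(100, duty + step)   # saturate at full brightness
    if gesture == "dimmer":
        return max(0, duty - step)     # saturate at off
    return duty                        # unrecognized gesture: keep brightness
```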

Posttraumatic Stiffness of the Shoulder

  • 최창혁
    • Clinics in Shoulder and Elbow / Vol. 9, No. 1 / pp. 14-19 / 2006
  • The patient with posttraumatic stiffness frequently has a history of prolonged immobilization after a traumatic event. Adhesions in the extraarticular humeroscapular motion interface may be present independently or in combination with intraarticular capsular contractures. A thorough history and physical examination usually reveal the cause and anatomic location of the stiffness. A passive stretching exercise program is effective as first-line treatment, but manipulation under anesthesia is usually not effective because of potential complications such as fracture, tendon rupture, and neurologic injury. Humeroscapular motion interface adhesions can be released either open or arthroscopically. The combined technique, coupled with an aggressive rehabilitation program, can provide more effective motion restoration and pain relief.

A Study of a New Interface System for Disabled and Elderly People Who Have Difficulty Using Electronic Equipment

  • 정성부; 김주웅
    • Journal of the Korea Institute of Information and Communication Engineering / Vol. 16, No. 12 / pp. 2595-2600 / 2012
  • This study proposes a new interface for disabled or elderly people who have difficulty using electronic equipment. Instead of the physical-switch interfaces found in most electronic devices, the proposed method operates devices through voice and motion recognition using a headset equipped with a microphone and a 3-axis accelerometer. For voice recognition, commands are delivered through the microphone and a voice recognition module controls the PC mouse. For motion recognition, head movements are sensed by the headset's 3-axis accelerometer, which moves the mouse pointer to execute commands. To verify the usefulness of the proposed interface, a headset was fabricated and tested in a laboratory, a cafeteria, and a library.
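The head-motion pointer control could be sketched as follows, assuming gravity-referenced tilt estimated from the 3-axis accelerometer (the gain and dead-zone values are illustrative parameters, not the paper's):

```python
import math

def tilt_to_pointer(ax, ay, az, pos, gain=40.0, dead_zone=0.05):
    """Convert accelerometer readings (in g) from the headset into a
    new pointer position; small tilts inside the dead zone are ignored
    so that normal head tremor does not move the cursor."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))  # nod up/down
    roll = math.atan2(ay, az)                              # tilt left/right
    dx = gain * roll if abs(roll) > dead_zone else 0.0
    dy = gain * pitch if abs(pitch) > dead_zone else 0.0
    return (pos[0] + dx, pos[1] + dy)
```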

Implementing a Leap-Motion-Based Interface for Enhancing the Realism of Shooter Games

  • 신인호; 천동훈; 박한훈
    • Journal of the HCI Society of Korea / Vol. 11, No. 1 / pp. 5-10 / 2016
  • This paper provides a more realistic control scheme for shooter games by recognizing the user's hand gestures with a Leap Motion sensor. Functions essential to shooter games, such as firing, movement, viewpoint change, and zoom in/out, were implemented. A user study confirmed that replacing the game interface with familiar, intuitive hand gestures outperforms the conventional mouse/keyboard in ease of control, engagement, and extensibility. Specifically, average user satisfaction (on a 1-5 scale) was 3.02 for the mouse/keyboard interface versus 3.57 for the hand-gesture interface.
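A minimal sketch of such a gesture-to-action mapping (the gesture features and action names are assumptions for illustration; the actual Leap Motion SDK exposes much richer hand-tracking data):

```python
def hand_to_action(pinching, extended_fingers, palm_forward_speed):
    """Map a simplified hand state to a shooter-game command."""
    if pinching:
        return "fire"                  # pinch gesture pulls the trigger
    if extended_fingers == 0:
        return "zoom_in"               # closed fist zooms the sight
    if extended_fingers == 5 and palm_forward_speed > 0.5:
        return "move_forward"          # open palm pushed forward
    if extended_fingers == 5:
        return "change_view"           # open palm held still pans the view
    return "idle"
```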

Real-Time Centralized Soft Motion Control System for High Speed and Precision Robot Control

  • 정일균; 김정훈
    • IEMEK Journal of Embedded Systems and Applications / Vol. 8, No. 6 / pp. 295-301 / 2013
  • In this paper, we propose a real-time centralized soft motion control system for high-speed, high-precision robot control. The system uses EtherCAT as a high-speed industrial motion network to enable force-based motion control in real time, and consists of a software-based master controller on a PC and slave interface modules. Hard real-time control capability is essential for high-speed, high-precision robot control. To implement software-based real-time control, the master controller is designed using a real-time kernel (RTX) and the EtherCAT network, and the servo processes are located in the master controller for centralized motion control. In the proposed system, the slave interface modules simply collect all sensor information from the robot and transfer it to the master controller via the EtherCAT network. Experimental results prove that the proposed soft motion control system has sufficient real-time control capability to be applied to various robot control systems.
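The centralized structure, in which slaves only report sensor data and the master computes every servo command, might be sketched as below (the proportional gain and discrete per-cycle update are illustrative; the real system closes the loop over EtherCAT at a fixed hard-real-time period):

```python
def servo_cycle(targets, positions, kp=0.5):
    """One master-side servo cycle: compute a command for every axis
    from the sensor feedback collected by the slave interface modules."""
    return [kp * (t - p) for t, p in zip(targets, positions)]

def run(targets, positions, cycles, kp=0.5):
    """Simulate the periodic centralized control loop for several cycles."""
    pos = list(positions)
    for _ in range(cycles):
        cmds = servo_cycle(targets, pos, kp)           # master computes
        pos = [p + c for p, c in zip(pos, cmds)]       # axes respond
    return pos
```

Because all servo processes live in the master, adding an axis is just another entry in the lists rather than new firmware on a slave module.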

Extension of a Kinetic Typography System Considering Text Components

  • 정승아; 이다솜; 임순범
    • Journal of Korea Multimedia Society / Vol. 20, No. 11 / pp. 1828-1841 / 2017
  • In previous research, we proposed a kinetic typography font engine that can easily add motion to text with function calls only. However, since it was aimed at constructing movements for a whole sentence, producing various kinetic typography motions at the word or letter level remained inconvenient. We propose a Kinetic Typography Extended Motion API (Application Programming Interface) that extends the existing Kinetic Motion API. The extended kinetic typography font engine, including the kinetic typography motion library provided as functions, aims to simplify the production of kinetic typography at the word and letter level. In addition, a kinetic typography authoring interface is provided to facilitate the construction of a motion library for the various applications that can apply kinetic typography.
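The word/letter granularity the extension adds could look roughly like this (the function signature and unit names are hypothetical, not the engine's actual API):

```python
def apply_motion(text, motion, unit="sentence"):
    """Attach a named motion to text at sentence, word, or letter level,
    returning (fragment, motion) pairs for a font engine to animate."""
    if unit == "sentence":
        fragments = [text]
    elif unit == "word":
        fragments = text.split()
    elif unit == "letter":
        fragments = [ch for ch in text if not ch.isspace()]
    else:
        raise ValueError(f"unknown unit: {unit}")
    return [(f, motion) for f in fragments]
```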

Implementation of Human Motion Following Robot through Wireless Communication Interface

  • Choi, Hyoukryeol; Jung, Kwangmok; Ryew, SungMoo; Kim, Hunmo; Jeon, Jaewook; Nam, Jaedo
    • Institute of Control, Robotics and Systems (ICROS) Conference Proceedings / ICCAS 2002 / pp.36.3-36 / 2002
  • Keywords: Motion capture system; Exoskeleton mechanism; Kinematics analysis; Man-machine interface; Wireless communication; Control algorithm


Development of a Motion Recognition Platform Using Smartphone Tracking and Color Communication

  • 오병훈
    • The Journal of the Institute of Internet, Broadcasting and Communication / Vol. 17, No. 5 / pp. 143-150 / 2017
  • This paper develops a new motion recognition platform using smartphone tracking and color communication. Using a camera-equipped PC or smart TV together with a personal smartphone, it provides a motion recognition user interface based on vision-based object recognition. The user moves the smartphone by hand like a motion controller; the platform detects the smartphone in real time and recognizes the user's motion by estimating its 3D distance and orientation. In addition, a communication system based on color digital codes is used for communication between the smartphone and the server. Users can freely exchange text data via the color communication method and can transmit data continuously even while performing motions. Executable content based on the proposed motion recognition platform is implemented and the results are presented.
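One way such a color digital code could carry text, encoding the bytes as a sequence of displayed colors at three bits per color, is sketched below (the eight-color palette and 3-bit symbol size are assumptions for illustration, not the paper's actual code):

```python
# Eight-color palette: each displayed color symbol carries 3 bits.
PALETTE = ["black", "red", "green", "yellow",
           "blue", "magenta", "cyan", "white"]

def encode(text):
    """Turn a text string into a list of color symbols to display."""
    bits = "".join(f"{b:08b}" for b in text.encode("utf-8"))
    bits += "0" * (-len(bits) % 3)             # pad to a whole symbol
    return [PALETTE[int(bits[i:i + 3], 2)] for i in range(0, len(bits), 3)]

def decode(colors):
    """Recover the text from a received color sequence."""
    bits = "".join(f"{PALETTE.index(c):03b}" for c in colors)
    data = bytes(int(bits[i:i + 8], 2)
                 for i in range(0, len(bits) // 8 * 8, 8))
    return data.decode("utf-8")
```

Because the camera only needs to classify one of eight colors per frame, such a code remains decodable while the phone is being waved around as a motion controller.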

Kinematics Analysis and Implementation of a Motion-Following Task for a Humanoid Slave Robot Controlled by an Exoskeleton Master Robot

  • Song, Deok-Hui; Lee, Woon-Kyu; Jung, Seul
    • International Journal of Control, Automation, and Systems / Vol. 5, No. 6 / pp. 681-690 / 2007
  • This article presents the kinematic analysis and the implementation of an interface and control for two robots: an exoskeleton master robot and a human-like slave robot with two arms. The two robots are designed and built for motion-following tasks. The operator wears the exoskeleton master robot to generate motions, and the slave robot is required to follow the motion of the master robot. To synchronize the motions of the two robots, kinematic analysis is performed to correct the kinematic mismatch between them. The interface and control are implemented in hardware to test motion-following tasks, and experiments confirm the feasibility of motion-following by the two robots.
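The mismatch-correction step might be sketched as a per-joint affine map clamped to the slave's joint limits (the scales, offsets, and limits below are illustrative, not the robots' actual parameters):

```python
def map_joint(master_angle, scale, offset, limits):
    """Map one exoskeleton (master) joint angle onto the slave robot,
    correcting kinematic mismatch and clamping to the slave's range."""
    lo, hi = limits
    return max(lo, min(hi, scale * master_angle + offset))

def map_arm(master_angles, joint_params):
    """Apply the per-joint correction to every joint of one arm."""
    return [map_joint(a, s, o, lim)
            for a, (s, o, lim) in zip(master_angles, joint_params)]
```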