Myo Armband

An Introduction of Myo Armband and Its Comparison with Motion Capture Systems

  • Cho, Junghun; Lee, Jang Hyung; Kim, Kwang Gi
    • Journal of Multimedia Information System / v.5 no.2 / pp.115-120 / 2018
  • Recently, methods for accurately measuring the three-dimensional movements of the hand have been actively researched so that the measurement data can be used in therapeutic and rehabilitation programs. This paper introduces the Myo Armband, a wearable device comprising a 3-axis accelerometer, a 3-axis gyroscope, and electromyographic (EMG) sensors. We compare the armband's performance with that of a motion capture system, a device known to provide fairly accurate measurements of angular movements. The movement scenarios comprised dart-throwing and wrist-winding motions. The paper also discusses one of the armband's advantages, portability, and suggests its potential as a substitute for previously used devices. The measurement accuracy obtained was comparable to that of the three-dimensional measurement device.
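
A minimal sketch of the kind of comparison this paper describes: converting the armband's orientation quaternions to Euler angles and computing the error against motion-capture reference angles. The sample data, function names, and the choice of the pitch axis are illustrative assumptions; only the quaternion-to-Euler math is standard.

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion to (roll, pitch, yaw) in degrees."""
    roll = math.degrees(math.atan2(2 * (w * x + y * z),
                                   1 - 2 * (x * x + y * y)))
    # Clamp the asin argument to guard against floating-point drift
    pitch = math.degrees(math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x)))))
    yaw = math.degrees(math.atan2(2 * (w * z + x * y),
                                  1 - 2 * (y * y + z * z)))
    return roll, pitch, yaw

def rmse(measured, reference):
    """Root-mean-square error between two equal-length angle sequences."""
    return math.sqrt(sum((m - r) ** 2 for m, r in zip(measured, reference))
                     / len(measured))

# Hypothetical samples: Myo quaternions and mocap pitch angles (degrees)
myo_quats = [(0.98, 0.05, 0.17, 0.02), (0.95, 0.08, 0.29, 0.04)]
mocap_pitch = [19.5, 34.1]

myo_pitch = [quaternion_to_euler(*q)[1] for q in myo_quats]
print("pitch RMSE (deg):", round(rmse(myo_pitch, mocap_pitch), 2))
```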

Teleoperation Control of ROS-based Industrial Robot Using EMG Signals (근전도센서를 이용한 ROS기반의 산업용 로봇 원격제어)

  • Jeon, Se-Yun; Park, Bum Yong
    • IEMEK Journal of Embedded Systems and Applications / v.15 no.2 / pp.87-94 / 2020
  • This paper introduces a method for controlling an industrial robot arm to imitate the movement of the human arm and hand using electromyography (EMG) signals. The proposed method is implemented on the UR3, a popular industrial robot, and a MYO armband that measures the EMG signals generated by human muscles. Communication with the UR3 robot and the MYO armband is integrated in the robot operating system (ROS), middleware that simplifies the development of robot systems. The movement of the human arm and hand is detected by the MYO armband, which is used to recognize the motion of the operator's hand and to estimate the speed of the operator's arm movement. The proposed system can readily be used where detailed human movement is required in environments where humans cannot work. Experiments using teleoperation of the UR3 robot have been conducted to verify the performance of the proposed method.
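
A minimal ROS sketch in the spirit of this paper: map Myo gestures to UR3 joint velocities and publish them from a rospy node. The topic name, the gesture source, and the gesture-to-velocity table are assumptions for illustration; the actual command topic depends on the UR driver configuration the authors used.

```python
import rospy
from std_msgs.msg import Float64MultiArray

# Hypothetical mapping: gesture label -> velocity for the base joint (rad/s)
GESTURE_TO_BASE_VEL = {"wave_in": -0.2, "wave_out": 0.2, "rest": 0.0}

def read_myo_gesture():
    """Placeholder for gesture classification from Myo EMG data."""
    return "rest"  # assumption: replaced by a real classifier in practice

def teleop_loop():
    rospy.init_node("myo_ur3_teleop")
    # Assumed velocity-controller topic; depends on the UR3 driver setup
    pub = rospy.Publisher("/joint_group_vel_controller/command",
                          Float64MultiArray, queue_size=1)
    rate = rospy.Rate(50)  # publish commands at 50 Hz
    while not rospy.is_shutdown():
        vel = GESTURE_TO_BASE_VEL.get(read_myo_gesture(), 0.0)
        msg = Float64MultiArray()
        msg.data = [vel, 0.0, 0.0, 0.0, 0.0, 0.0]  # six joints; base only
        pub.publish(msg)
        rate.sleep()

if __name__ == "__main__":
    teleop_loop()
```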

Autonomous Mobile Robot Control using the Wearable Devices Based on EMG Signal for detecting fire (EMG 신호 기반의 웨어러블 기기를 통한 화재감지 자율 주행 로봇 제어)

  • Kim, Jin-Woo; Lee, Woo-Young; Yu, Je-Hun; Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.26 no.3 / pp.176-181 / 2016
  • In this paper, an autonomous mobile robot control system for fire detection is proposed using a wearable device based on EMG (electromyogram) signals. A Myo armband is used to detect the user's EMG signal. Gestures are classified after the EMG signal data are sent to a computer over Bluetooth. In our experiment, a robot named 'uBrain' was implemented to move according to the data received over Bluetooth. 'Move front', 'Turn right', 'Turn left', and 'Stop' are the controllable commands for the robot. If the robot cannot receive the Bluetooth signal from the user, or if the user wants to switch from manual to autonomous mode, the robot enters autonomous mode. While moving, the robot flashes an LED when its IR sensor detects fire.
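
A minimal sketch of the manual/autonomous mode logic this abstract describes: the robot obeys Bluetooth gesture commands and falls back to autonomous mode when no command arrives within a timeout. All function names, command labels, and the timeout value are illustrative assumptions.

```python
import time

COMMANDS = {"move_front", "turn_right", "turn_left", "stop"}
TIMEOUT_S = 2.0  # assumed: silence longer than this triggers autonomous mode

def read_bluetooth_command():
    """Placeholder: returns a command string, 'auto', or None (no signal)."""
    return None

def control_loop(drive, explore, check_fire, flash_led):
    last_rx = time.time()
    autonomous = False
    while True:
        cmd = read_bluetooth_command()
        if cmd in COMMANDS:
            autonomous = False
            last_rx = time.time()
            drive(cmd)                      # manual gesture control
        elif cmd == "auto" or time.time() - last_rx > TIMEOUT_S:
            autonomous = True               # user request or lost signal
        if autonomous:
            explore()                       # one autonomous navigation step
        if check_fire():                    # IR sensor polled while moving
            flash_led()
        time.sleep(0.05)
```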

Implementation of Mutual Conversion System between Body Movement and Visual·Auditory Information (신체 움직임-시·청각 정보 상호변환 시스템의 구현)

  • Bae, Myung-Jin; Kim, Sung-Ill
    • Journal of IKEEE / v.22 no.2 / pp.362-368 / 2018
  • This paper implements a system that mutually converts between body motion signals and visual and auditory signals. The study is based on intentional synesthesia that can be perceived through learning. Euler angles (roll, pitch, and yaw signals) output by a wearable armband (Myo) were used to represent body movements as a muscle sense. MIDI (Musical Instrument Digital Interface) signals and the HSI (hue, saturation, intensity) color model were used as the auditory and visual signals, respectively. Applying a one-to-one correspondence between the body motion signals and the visual and auditory signals made the conversion easy to infer. In simulations using ROS (Robot Operating System) and the 3D simulation tool Gazebo, input motion signals were compared with the converted output signals, verifying the mutual conversion between body motion information and visual and auditory information.
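
A minimal sketch of the one-to-one correspondence idea: normalize roll, pitch, and yaw from the armband and assign each to one perceptual channel. The specific channel assignments and angle ranges are assumptions; the abstract states only that a one-to-one mapping was used.

```python
def normalize(angle, lo=-180.0, hi=180.0):
    """Map an angle in [lo, hi] to [0, 1], clamping out-of-range values."""
    return (min(max(angle, lo), hi) - lo) / (hi - lo)

def euler_to_hsi(roll, pitch, yaw):
    """Assumed mapping: roll -> hue, pitch -> saturation, yaw -> intensity."""
    return (normalize(roll) * 360.0,   # hue in degrees
            normalize(pitch),          # saturation in [0, 1]
            normalize(yaw))            # intensity in [0, 1]

def euler_to_midi_note(pitch):
    """Assumed mapping: pitch angle -> MIDI note number in [0, 127]."""
    return round(normalize(pitch) * 127)

print(euler_to_hsi(45.0, -30.0, 90.0))   # (225.0, 0.4166..., 0.75)
print(euler_to_midi_note(-30.0))         # 53
```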

Multi-sensor based NUI/NUX framework for various interactive applications (다양한 상호작용 어플리케이션을 위한 이종 센서 NUI/NUX 프레임워크)

  • Zhang, Weiqiang; Xi, Yulong; Wen, Mingyun; Cho, Seoungjae; Chae, Jeongsook; Kim, Junoh; Um, Kyhyun; Cho, Kungeun
    • Proceedings of the Korea Information Processing Society Conference / 2017.04a / pp.1077-1078 / 2017
  • In this study, we implement a natural user interface/experience (NUI/NUX) framework using multiple sensors: Microsoft Kinect, Leap Motion, and Myo Armband. The framework is designed for use in various types of interactive applications. We integrate the functions of the three sensors into an application and provide an interface that customers can use to interact with a computer easily. The framework can track body information in real time and accurately recognize the motion of different body parts.
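
A minimal sketch of one way such a framework could unify heterogeneous sensors behind a single interface. The event schema, sensor labels, and adapter design are assumptions; real Kinect/Leap Motion/Myo SDK calls would feed the event queue.

```python
from queue import Queue
from dataclasses import dataclass

@dataclass
class NUIEvent:
    sensor: str    # "kinect", "leap", or "myo"
    kind: str      # e.g. "skeleton", "hand_pose", "gesture"
    payload: dict  # normalized data, shared by all applications

class NUIFramework:
    def __init__(self):
        self.events = Queue()
        self.handlers = {}

    def register(self, kind, handler):
        """Applications subscribe to one event kind with a callback."""
        self.handlers.setdefault(kind, []).append(handler)

    def push(self, event):
        """Sensor adapters push normalized events from their own threads."""
        self.events.put(event)

    def dispatch(self):
        """Main loop: deliver queued events to the registered handlers."""
        while not self.events.empty():
            ev = self.events.get()
            for handler in self.handlers.get(ev.kind, []):
                handler(ev)

# Usage: an application reacting to gestures regardless of which sensor
fw = NUIFramework()
fw.register("gesture", lambda ev: print(ev.sensor, ev.payload))
fw.push(NUIEvent("myo", "gesture", {"name": "fist"}))
fw.dispatch()
```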

Designing Effective Virtual Training: A Case Study in Maritime Safety

  • Jung, Jinki; Kim, Hongtae
    • Journal of the Ergonomics Society of Korea / v.36 no.5 / pp.385-394 / 2017
  • Objective: The aim of this study is to investigate how to design effective virtual reality-based training (i.e., virtual training) in maritime safety and to present methods for enhancing interface fidelity by employing immersive interaction and 3D user interface (UI) design. Background: Emerging virtual reality technologies and hardware can provide immersive experiences to individuals, and theory suggests that improved fidelity can increase training efficiency. Such a sense of immersion can serve as an element of effective training in the virtual space. Method: As an immersive interaction, we implemented gesture-based interaction using Leap Motion and Myo armband sensors. Hand gestures captured by both sensors are used to interact with the virtual appliance in the scenario. The proposed 3D UI design is employed to visualize appropriate information for tasks in training. Results: A usability study was carried out to evaluate the effectiveness of the proposed method. In terms of satisfaction, UI intuitiveness, ease of procedure learning, and equipment understanding, virtual training was rated superior to existing training. These improvements were also independent of the type of input device used for virtual training. Conclusion: Experiments showed that the proposed interaction design yields more efficient interaction than the existing training method. Improving interface fidelity through intuitive and immediate feedback on the input device and the training information improves user satisfaction with the system as well as training efficiency. Application: The design methods for an effective virtual training system can be applied to other areas in which trainees are required to perform sophisticated tasks with their hands.
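
A minimal sketch of device-independent gesture handling, echoing the finding above that training gains did not depend on the input device: gestures from either sensor are reduced to one shared label set before being mapped to training actions. The gesture labels, lookup tables, and action names are illustrative assumptions.

```python
# Assumed vocabulary of training actions triggered by gestures
GESTURE_ACTIONS = {
    "grab": "pick_up_appliance",
    "release": "put_down_appliance",
    "turn": "operate_valve",
}

def normalize_gesture(device, raw_label):
    """Map device-specific labels to a shared vocabulary (assumed tables)."""
    tables = {
        "leap": {"pinch": "grab", "open_hand": "release", "rotate": "turn"},
        "myo": {"fist": "grab", "fingers_spread": "release",
                "wave_out": "turn"},
    }
    return tables[device].get(raw_label)

def handle_gesture(device, raw_label):
    gesture = normalize_gesture(device, raw_label)
    action = GESTURE_ACTIONS.get(gesture)
    if action:
        print(f"[{device}] {raw_label} -> {action}")  # trigger the VR action

handle_gesture("myo", "fist")       # [myo] fist -> pick_up_appliance
handle_gesture("leap", "rotate")    # [leap] rotate -> operate_valve
```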